Article

Optimization under uncertainty: State-of-the-art and opportunities

Author: Nikolaos V. Sahinidis

Abstract

A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Uncertainty, for instance, governs the prices of fuels, the availability of electricity, and the demand for chemicals. A key difficulty in optimization under uncertainty is in dealing with an uncertainty space that is huge and frequently leads to very large-scale optimization models. Decision-making under uncertainty is often further complicated by the presence of integer decision variables to model logical and other discrete decisions in a multi-period or multi-stage setting. This paper reviews theory and methodology that have been developed to cope with the complexity of optimization problems under uncertainty. We discuss and contrast the classical recourse-based stochastic programming, robust stochastic programming, probabilistic (chance-constraint) programming, fuzzy programming, and stochastic dynamic programming. The advantages and shortcomings of these models are reviewed and illustrated through examples. Applications and the state-of-the-art in computations are also reviewed. Finally, we discuss several main areas for future development in this field. These include development of polynomial-time approximation schemes for multi-stage stochastic programs and the application of global optimization algorithms to two-stage and chance-constraint formulations.
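For concreteness, the recourse-based stochastic programs discussed in the abstract typically take the following textbook two-stage form (standard notation, not reproduced from the paper itself): here-and-now decisions x are fixed before the uncertainty ξ is revealed, and recourse decisions y correct the plan afterwards.

```latex
\min_{x}\; c^{\top}x + \mathbb{E}_{\xi}\big[\,Q(x,\xi)\,\big]
\quad \text{s.t.}\quad Ax = b,\; x \ge 0,
\qquad\text{where}\qquad
Q(x,\xi) = \min_{y \ge 0}\;\big\{\, q(\xi)^{\top} y \;:\; W y = h(\xi) - T(\xi)\,x \,\big\}.
```

With finitely many scenarios, the expectation becomes a probability-weighted sum and the model collapses to a single large-scale (possibly mixed-integer) linear program, which is the source of the scale issues the abstract highlights.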


... Different classifications for uncertainties have been proposed [59,63,64]. The importance of classifying different types of uncertainties lies mainly in facilitating the selection of suitable methods to treat particular groups of uncertainties. ...
... The second set is random, since it depends on the realisation of uncertain variables. Therefore, the aim is to choose the first set to optimise a given measure and the expected value of the same measure for the second set [63]. ...
... However, an interval carries no information about the probabilities with which the variables assume particular values inside it; this has to be taken into consideration, since ignoring it could oversimplify the problem and discard important information. Finally, the ranges of the variables can be used to obtain worst-case scenarios [63]. ...
Thesis
The manufacturing process of photovoltaic devices, such as solar cells, relies on the production of Transparent and Conductive Oxide (TCO) films. One of the techniques for creating these films is based on Aerosol-Assisted Chemical Vapour Deposition (AACVD). The AACVD process comprises the atomisation of a precursor solution into aerosol droplets, which are transported to a heated chamber for the synthesis of films such as the TCOs, as well as coatings, powders, composites and nanotubes. At present, AACVD has not been used as an industrial deposition technique. However, it has the potential to be scaled-up due to its versatility and the ease through which effective functional coatings can be deposited at a laboratory-scale. Computational simulations are pivotal to study the feasibility of such a scale-up. This thesis presents, therefore, an integrated model to support the AACVD process scale-up. The model is comprised of four stages: aerosol generation, transport, delivery and chemical deposition. The generation of aerosol is described by a distribution of droplet sizes, which is the input to a transport model that incorporates the impact of aerosol losses. The output distribution provides sufficient information to predict the amount and sizing of aerosol reaching the deposition site. Experimental validation has shown the model to be effective at predicting transport losses and droplet sizes. The delivery stage includes the solvent evaporation, accounting for uncertainties in the temperature profile of the deposition site. This is a key factor for the solvent evaporation, setting the precursors free to react and form the desired products. For the chemical deposition stage, reactions in the solid and gas phases were studied. The model presented is suitable for application on the scale of industrial processes and is also suitable for processes that rely on atomisation and transport of particles, for example, spray drying or cooling and fuel combustion. Lessons learned in modelling uncertainties and their impact on process scale-up motivated the research into formulation, modelling and solution methods for such applications. Therefore, as an additional contribution, this thesis introduces Uncertainty.jl, a modelling framework focused on the treatment of uncertainty.
... The literature offers different kinds of methodologies for investigating uncertainty, among them fuzzy programming, stochastic programming, and robust optimization [12][13]. Altuntas et al. [14] proposed fuzzy weighted association rules so as to handle the facility layout problem in the CMS area. ...
... Based on Constraint (11), an operator with a minimum skill value can be selected to work with a machine tool. The operator learning-forgetting effect is taken into account according to Eq. (12). Based on these two constraints, the operator's skill level must be updated in each production period. ...
Article
One of the most critical issues in manufacturing systems is operator management. In this paper, the operator assignment problem is studied within a cellular manufacturing system. The most important novelty of this research is the consideration of operator learning and forgetting effects simultaneously. The skill level of an operator can be increased/decreased based on the time spent on a machine. Moreover, issues related to operators, such as hiring, firing, and salaries, are considered in the proposed model. The parameters are considered to be uncertain in this model, and a robust optimization approach is developed to handle this uncertainty. Using this approach, the model solution remains feasible (or even optimal) for different levels of parameter uncertainty. To verify and validate the proposed model, some numerical instances are randomly generated and solved using GAMS. A statistical analysis is also conducted on the objective function values of the linear and nonlinear models, followed by some managerial insights.
... We refer the reader to the survey papers [12] and [39] for more comprehensive overviews of stochastic optimization. ...
Preprint
Full-text available
Optimization equips engineers and scientists in a variety of fields with the ability to transcribe their problems into a generic formulation and receive optimal solutions with relative ease. Industries ranging from aerospace to robotics continue to benefit from advancements in optimization theory and the associated algorithmic developments. Nowadays, optimization is used in real time on autonomous systems acting in safety-critical situations, such as self-driving vehicles. It has become increasingly important to produce robust solutions by incorporating uncertainty into optimization programs. This paper provides a short survey of the state of the art in optimization under uncertainty. The paper begins with a brief overview of the main classes of optimization without uncertainty. The rest of the paper focuses on the different methods for handling both aleatoric and epistemic uncertainty. Many of the applications discussed in this paper are within the domain of control. The goal of this survey paper is to briefly touch upon the state of the art in a variety of different methods and refer the reader to other literature for more in-depth treatments of the topics discussed here.
... Despite all the advancement of optimization methodology and solvers, realistic and practical problem-solving can be difficult, as the inclusion of all existing constraints can create enormous resource requirements to build and maintain models (Honkomp et al., 2000). Further, representing data uncertainty within optimization models can be extremely difficult given the lack of reported statistical distributions in data sources and the challenge of implementing appropriate optimization approaches that incorporate those distributions (Sahinidis, 2004; Balasubramanian and Grossmann, 2003; Wang et al., 2018; Li and Ierapetritou, 2008). ...
Article
With increased understanding of the impacts of climate change, institutions across the world are committing to achieve carbon neutrality and other emissions reduction targets. Within academia, the American College & University Presidents’ Climate Commitment was created as a highly visible public contract with over 300 active signatories. Existing climate action planning tools rely on significant subjectivity and previously conducted project analysis. This work presents an optimization framework that develops a long-term planning tool tailored for use by academic institutions. Further, we highlight the potential for ecological solutions and land management as economically competitive actions for campuses. The presented tool, known as DCARB (Designing Climate Action and Regulations for sustainaBility), features an ability to quickly optimize multiple pathways towards carbon neutrality based on user input and parameter variability. We showcase DCARB through an application for The Ohio State University to demonstrate the outputs of the tool based on current projects outlined in the institution’s climate action plan. Additionally, parameter variability is analyzed to explore how solutions might change due to data uncertainty or application to other campuses. The results encourage urgent action as they can provide cheaper pathways towards neutrality, including immediate consideration of land-based actions. Additionally, applications of the tool highlight challenges for achieving neutrality such as a reliance on market-based offsets, future market volatility, proper carbon credit accounting, and planning for future social and behavioral actions.
... Typical approaches used for uncertainty optimization include stochastic programming, robust optimization, chance constrained optimization and distributionally robust optimization (Li and Ierapetritou, 2008;Sahinidis, 2004). ...
Thesis
According to the long-term air traffic forecasts published by the International Civil Aviation Organization (ICAO) in 2018, global passenger traffic is expected to grow by 4.2% annually from 2018 to 2038, using the traffic data of 2018 as a baseline. Even though the outbreak of COVID-19 has caused a huge impact on air transportation, traffic is gradually recovering. Considering the potential future demand, air traffic efficiency and safety will remain critical issues to be considered. In the airspace system, the runway is the main bottleneck in the aviation chain. Moreover, in the domain of air traffic management, the Terminal Maneuvering Area (TMA) is one of the most complex areas, with all arrivals converging to land. This motivates the development of suitable decision support tools for providing proper advisories for arrival management. In this thesis, we propose two optimization approaches that aim to provide suitable control solutions for arrival management in the TMA and in the extended horizon that includes the TMA and the enroute phase. In the first part of this thesis, we address the aircraft scheduling problem under uncertainty in the TMA. Uncertainty quantification and propagation along the routes are realized in a trajectory model that formulates the time information as random variables. Conflict detection and resolution are performed at waypoints of a predefined network based on the predicted time information from the trajectory model. By minimizing the expected number of conflicts, consecutively operated flights can be well separated. Apart from the proposed model, two other models - the deterministic model and the model that incorporates separation buffers - are presented as benchmarks. Simulated annealing (SA) combined with the time decomposition sliding window approach is used for solving a case study of the Paris Charles de Gaulle (CDG) airport. Further, a simulation framework based on the Monte-Carlo approach is implemented to randomly perturb the optimized schedules of the three models so as to evaluate their performances. Statistical results show that the proposed model has absolute advantages in conflict absorption when uncertainty arises. In the second part of this thesis, we address a dynamic/on-line problem based on the concept of Extended Arrival MANagement (E-AMAN). The E-AMAN horizon is extended up to 500NM from the destination airport so as to enhance the cooperation and situational awareness of the upstream sector control and the TMA control. The dynamic feature is addressed by periodically updating the real aircraft trajectory information based on the rolling horizon approach. For each time horizon, a sub-problem is established taking as objective the weighted sum of safety metrics in the enroute segment and in the TMA. A dynamic weights assignment approach is proposed to emphasize the fact that as an aircraft gets closer to the TMA, the weight for its metrics associated with the TMA should increase. A case study is carried out using the real arrival traffic data of the Paris CDG airport. Final results show that through early adjustment, the arrival time of the aircraft can meet the required schedule for entering the TMA, thus ensuring overall safety and reducing holding time. In the third part of this thesis, an algorithm that expedites the optimization process is proposed. Instead of evaluating the performance of all aircraft, the performance of a single aircraft is considered and a corresponding objective function is created. Through this change, the optimization process benefits from fast evaluation of the objective and high convergence speed. In order to verify the proposed algorithm, results are analyzed in terms of execution time and quality of result compared to the originally used algorithm.
... Although all the levels are important, this paper focuses on the fourth level, the sizing of the components for a given technology and an already-fixed topology, with the aim of equipping the design engineer with a methodology to optimize the design when uncertainty is present, also known as optimal robust design (Zang et al., 2005). In particular, robust optimization with deterministic uncertainty (Goerigk and Schöbel, 2016) will be considered, instead of the more conventional approaches using stochastic uncertainty (Sahinidis, 2004; Hamzaçebi, 2020), to make the method more realistic and better suited to the design of new products for which stochastic information is typically not yet available. It is well known that robust optimization with deterministic uncertainty suffers from issues of computational tractability, for which reformulations are required (Gorissen et al., 2015). ...
Article
Full-text available
This paper addresses the problem of finding a robust optimal design when uncertain parameters in the form of crisp or interval sets are present in the optimization. Furthermore, in order to make the approach as general as possible, direct search methods with the help of sensitivity analysis techniques are employed to optimize the design. Consequently, the presented approach is suitable for black box models for which no, or very little, information of the equations governing the model is available. The design of an electric drivetrain is used to illustrate the benefits of the proposed method.
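As an illustration of the kind of method this abstract describes, the sketch below pairs a derivative-free direct search with a corner-point approximation of interval (crisp-set) uncertainty; the objective function and the bounds are hypothetical placeholders, not the paper's drivetrain model.

```python
import itertools

import numpy as np
from scipy.optimize import minimize

# Hypothetical black-box performance model f(x, p): design variables x,
# uncertain parameters p. Stands in for, e.g., a drivetrain simulation.
def performance(x, p):
    return (x[0] - p[0]) ** 2 + p[1] * x[1] ** 2 + 0.5 * x[0] * x[1]

# Interval ("crisp set") uncertainty: p[i] lies in [lo, hi]. The worst case
# is approximated by evaluating all interval corners, which is exact when
# the model is monotone in each uncertain parameter.
p_bounds = [(0.8, 1.2), (0.5, 1.5)]
corners = list(itertools.product(*p_bounds))

def worst_case(x):
    return max(performance(x, p) for p in corners)

# Nelder-Mead is a direct search: no gradients, suitable for black boxes.
res = minimize(worst_case, x0=np.array([1.0, 0.5]), method="Nelder-Mead")
print("robust design:", res.x, "worst-case objective:", res.fun)
```

The direct search only ever calls the model, which is what makes the approach applicable when no information about the governing equations is available.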
... Uncertainty-Based Design Optimization (UDO) is an ongoing research topic to tackle uncertainties in design variables, parameters, and optimization systems, which can be traced back to the 1950s [21,22]. Most existing literature on this topic focuses on the design and optimization algorithms under uncertainties that are quantified and propagated [23][24][25][26][27]. Successful applications have been observed in the civil engineering industry [28][29][30]. ...
Article
Full-text available
State-of-the-art computational design approaches for designing the assembly of a structure often rely on high-precision building objects, which can be difficult and costly to fabricate. To consider a wider range of construction materials and fabrication processes, e.g. low-cost 3D printing and brick making, this research introduces a novel computational design method for fabrication-aware assembly problems using existing building components with dimensional variations. Given a finite set of building objects, this study develops a statistical preprocessing approach and a modified pattern search with an efficient integer linear programming method to generate the assembly of a target structure, such that it is statically stable and optimized for geometric smoothness and space utilization. In two case studies, the proposed algorithm exhibits significantly better boundary fitting and space utilization compared with a genetic algorithm and a standard pattern search method and lower computational cost compared with a genetic algorithm.
... Optimization is mostly used for productivity problems, especially when uncertainty is low. Indeed, optimization runs into difficulties when managing uncertainty (Sahinidis, 2004). The deterministic nature of optimization prevents the introduction of uncertainty and variability into the model (at least not as easily as simulation allows). ...
Thesis
For many years, the use of machines in industry has been regulated to meet safety standards and protect users. These measures concern safety as much as ergonomics, the objective being to protect workers in the long term as well as the short term. Too many occupational accidents and ergonomic problems are still observed at workstations. In France in 2013, nearly 618,263 occupational accidents leading to sick leave, 541 deaths caused by occupational accidents, and 430 deaths caused by occupational diseases were recorded by the social security system [INRS 2014]. Musculoskeletal disorders (MSDs) today account for more than two-thirds of recognized occupational diseases in France and are among the six most commonly recognized occupational diseases across Europe. To combat MSDs, integrating dynamic room for maneuver into the design of a production system is important to ensure good production performance while taking into account human factors such as fatigue and experience.
... Therefore, the use of a single parameter set to make an important model prediction should be avoided, because this practice does not reflect the degree of parameter and predictive uncertainty inherent in most modeling contexts. Methods and frameworks that can quantify, explain, and reduce the uncertainty of GHG predictions in estuarine environments are therefore needed. The use of advanced techniques in the field of environmental model sensitivity analysis and uncertainty quantification is emerging, with potential to maximize the utility of available data (Dietzel & Reichert, 2012), quantify the predictive uncertainty (Doherty, 2015; Sahinidis, 2004), and inform the sensitivity to model parameters and forcing data (Pianosi et al., 2016). However, to date these techniques have not been applied to estuarine CO2 and CH4 models or to characterizing the nature of estuary GHG emissions. ...
Article
Full-text available
Estuaries make an important contribution to the global greenhouse gas budget. Yet modeling predictions of carbon dioxide (CO2) and methane (CH4) emissions from estuaries remain highly uncertain due to both simplified assumptions about the underpinning hydrologic and biologic processes and inadequate data availability to uniquely define parameters related to CO2 and CH4 processes. This study presents a modeling framework to quantify the sensitivity and uncertainty of predicted CO2 and CH4 concentrations and emissions, which is demonstrated through application to a subtropical urban estuary (Brisbane River, Australia). A 3D hydrodynamic‐biogeochemical model was constructed, and calibrated using the model‐independent Parameter ESTimation software (PEST) with field data sets that captured strong gradients of CO2 and CH4 concentrations and emissions along the estuary. The approach refined uncertainty in the estimation of whole‐estuary annual emissions, and enabled us to assess the sensitivity and uncertainty of CO2 and CH4 dynamics. Estuarine CO2 concentrations were most sensitive to uncertainty in riverine inputs, whereas estuarine CH4 concentrations were most sensitive to sediment production and pelagic oxidation. Over the modeled year, variance in the daily fluctuations in carbon emissions from this case‐study spanned the full range of emission rates reported for estuaries around the world, highlighting that spatially or temporally limited sampling regimes could significantly bias estuarine greenhouse gas emission estimates. The combination of targeted field campaigns with the modeling approach presented in this study can help to improve carbon budgeting in estuaries, reduce uncertainty in emission estimates, and support management strategies to reduce or offset estuary greenhouse gas emissions.
... Typically, stochastic programs are based on scenarios, e.g., possible realizations of renewable production trajectories (Conejo et al., 2010; Morales et al., 2013; Chen et al., 2018a). The PSE community has been at the forefront of finding solutions to scheduling problems and stochastic programs for decades (Grossmann and Sargent, 1978; Halemane and Grossmann, 1983; Pistikopoulos and Ierapetritou, 1995; Sahinidis, 2004). Thus, energy system scheduling problems are solved successfully by the PSE community (Schäfer et al., 2019, 2020). ...
Preprint
Full-text available
We present a specialized scenario generation method that utilizes forecast information to generate scenarios for the particular usage in day-ahead scheduling problems. In particular, we use normalizing flows to generate wind power generation scenarios by sampling from a conditional distribution that uses day-ahead wind speed forecasts to tailor the scenarios to the specific day. We apply the generated scenarios in a simple stochastic day-ahead bidding problem of a wind electricity producer and run a statistical analysis focusing on whether the scenarios yield profitable and reliable decisions. Compared to conditional scenarios generated from Gaussian copulas and Wasserstein-generative adversarial networks, the normalizing flow scenarios identify the daily trends more accurately and with a lower spread while maintaining a diverse variety. In the stochastic day-ahead bidding problem, the conditional scenarios from all methods lead to significantly more profitable and reliable results compared to an unconditional selection of historical scenarios. The obtained profits using the normalizing flow scenarios are consistently closest to the perfect foresight solution, in particular, for small sets of only five scenarios.
... Modeling complex process systems has long been of interest in industry and the research community (Sahinidis, 2004). As a result, commercial simulation software has been developed, enabling engineers to quickly evaluate process designs without the capital costs associated with pilot studies (Cozad et al., 2014). ...
Chapter
Gaussian Processes present a versatile surrogate modeling toolbox to address simulation-based optimization and uncertainties arising from non-converged simulations. In this work we present a black-box optimization methodology framework in which Gaussian Process Regression is used to model complex underlying process performance models and Gaussian Process Classification is used to model feasibility constraints based on converged and non-converged simulations. Additionally, we present a conservativeness parameter to enable tuning of the feasible region based on the trade-off between process performance and the risk of infeasibility due to non-converged simulations.
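A minimal sketch of this pattern using scikit-learn, assuming a one-dimensional toy problem: GP regression models the performance surrogate, GP classification models the probability that a simulation converges, and a conservativeness threshold tau restricts candidates to points with high predicted convergence probability. The data, kernels, and threshold are illustrative, not the chapter's.

```python
import numpy as np
from sklearn.gaussian_process import (GaussianProcessClassifier,
                                      GaussianProcessRegressor)
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(40, 1))                     # simulation inputs
y_perf = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)   # toy performance
converged = (X.ravel() < 7.0).astype(int)                    # 1 = converged run

gpr = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y_perf)
gpc = GaussianProcessClassifier(kernel=RBF()).fit(X, converged)

X_cand = np.linspace(0.0, 10.0, 200).reshape(-1, 1)
mu = gpr.predict(X_cand)                    # surrogate performance prediction
p_ok = gpc.predict_proba(X_cand)[:, 1]      # predicted P(simulation converges)

tau = 0.8   # conservativeness: raising tau shrinks the admissible region
mask = p_ok >= tau
best = X_cand[mask][np.argmax(mu[mask])]
print("best candidate with P(converged) >=", tau, ":", best)
```

Sweeping tau traces out exactly the trade-off the chapter describes: a higher threshold lowers the risk of proposing a non-converging design at the cost of excluding potentially better candidates.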
... However, stochastic models are not conventional in biomass supply chain management because they are hard to solve optimally and it is difficult to guarantee constraint feasibility for all considered scenarios. According to Sahinidis [2004], there are three principal approaches to cope with uncertainty: stochastic programming (recourse models, robust stochastic programming, and probabilistic models), fuzzy programming (flexible and possibilistic programming), and stochastic dynamic programming. ...
Thesis
This thesis addresses the optimization of the production planning problem under energy availability constraints. The goal of the thesis is to develop appropriate mathematical models and optimization methods that can solve the problem at hand. In the first part, the Capacitated Single-Item Lot-Sizing Problem for a Flow-Shop System is studied by taking into account several energy sources (renewable and non-renewable). The aim is to minimize production, stock and energy costs by considering both the consumption and the mix of energy sources in the energy contract that the company subscribes to with the energy supplier. This problem has never been addressed in the literature. In the second part, the same problem is handled with consideration of the stochastic aspect of renewable energy sources. The motivation behind this attempt is to provide a decision-making system to manufacturers who wish to use renewable energy whose availability is not guaranteed. To model the uncertainty of the renewable energy sources, several probabilistic constraints are proposed. This study is the first attempt to apply probabilistic constraints to model the uncertain availability of renewable energy sources.
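In generic form (our notation, not the thesis's), a probabilistic constraint on renewable availability requires the plan to be feasible with a prescribed confidence level:

```latex
\Pr\big[\, e_t \le \tilde{A}_t \,\big] \;\ge\; 1-\varepsilon, \qquad t = 1,\dots,T,
```

where e_t is the renewable energy the plan intends to consume in period t, Ã_t is the random availability, and ε is the accepted risk of shortfall. When Ã_t has a known continuous distribution with quantile function F_t^{-1}, the constraint reduces to the deterministic bound e_t ≤ F_t^{-1}(ε), which is what makes chance-constrained models computationally attractive in this setting.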
... However, stochastic models are not conventional in biomass supply chain management because they are hard to solve optimally and it is difficult to guarantee constraint feasibility for all considered scenarios. According to Sahinidis [2004], there are three principal approaches to cope with uncertainty: stochastic programming (recourse models, robust stochastic programming, and probabilistic models), fuzzy programming (flexible and possibilistic programming), and stochastic dynamic programming. ...
Thesis
Full-text available
In the last decades, the use of renewable energy sources has reduced the effect of global warming. Biomass is a promising energy resource to produce heat, electricity, and biofuel. An efficient supply chain design and optimal inventory management could contribute to reducing biofuel prices significantly and improve competitiveness with fossil fuel. This thesis addresses two crucial problems: biomass supply chain optimization and inventory control for a perishable product (biomass). The former is devoted to supplier selection and operation planning in biomass supply chains under uncertainty. This problem is formulated as a deterministic (MILP) model and a two-stage stochastic programming model. The deterministic model is solved by using the MIP solver GUROBI. An enhanced L-shaped decomposition method is developed to find an optimal solution for the stochastic model. The latter deals with a stochastic inventory problem of a perishable product with uncertainty in both supply and demand. After demonstrating that its optimal inventory policy is an order-to-level policy, a Lagrangian relaxation based algorithm is developed to quickly find a near-optimal solution of the problem. The stochastic inventory problem is then extended to a product with a fixed lifetime. The conditional scenario method is developed to solve this problem approximately.
... The ILP method is used to solve the model with "unknown but bounded" uncertainties (21). ...
Conference Paper
Full-text available
The uncertainty in the vessel arrival time is identified as one of the major factors that disrupt the operation plan of the berth allocation and quay crane assignment problem (BACAP). To tackle this uncertainty, the authors integrate the vessel arrival time prediction into the optimization process of the BACAP and put forward an improved BACAP-VATP model. The vessel arrival time prediction provides an approximate estimate of the deviation between the ETA and ATA, in the form of an interval with only the upper and lower bounds known. Thus, when integrated with the vessel arrival time prediction, the proposed BACAP-VATP model includes an interval uncertainty that must be handled. A robust optimization approach based on interval linear programming (ILP) is applied to transform the proposed BACAP-VATP model with interval uncertainty into a deterministic multi-objective optimization model solved by an improved genetic algorithm (NSGA-II). In a case study conducted for a container terminal of Ningbo-Zhoushan Port, China, the performance of the proposed BACAP-VATP model is assessed based on the price of robustness and the reliability value. Compared to a BACAP model that inserts buffer times to absorb the uncertainty in the vessel arrival time, the proposed BACAP-VATP model performs more reliably under arrival time uncertainty without significantly reducing the operational efficiency of the baseline schedule.
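The "unknown but bounded" setting mentioned in the excerpt admits a classical worst-case (Soyster-style) reformulation; the following is that textbook transformation, not necessarily the exact one used in the BACAP-VATP paper. A linear constraint with interval coefficients ã_j in [ā_j − â_j, ā_j + â_j] holds for every realization if and only if

```latex
\sum_j \bar{a}_j\, x_j \;+\; \sum_j \hat{a}_j\,|x_j| \;\le\; b,
```

which for nonnegative variables simplifies to a deterministic linear constraint, Σ_j (ā_j + â_j) x_j ≤ b. The "price of robustness" reported in the case study measures how much objective value this worst-case protection costs relative to the nominal schedule.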
... Considering uncertainty is becoming an increasingly important facet of solution approaches [14,15], given the competitive and distributed nature of global markets. The simulation community has demonstrated strong benefits in handling uncertainty [16,17,18]. ...
Preprint
Full-text available
Reinforcement Learning (RL) has recently received significant attention from the process systems engineering and control communities. Recent works have investigated the application of RL to identify optimal scheduling decisions in the presence of uncertainty. In this work, we present a RL methodology to address precedence and disjunctive constraints as commonly imposed on production scheduling problems. This work naturally enables the optimization of risk-sensitive formulations such as the conditional value-at-risk (CVaR), which are essential in realistic scheduling processes. The proposed strategy is investigated thoroughly in a single-stage, parallel batch production environment, and benchmarked against mixed integer linear programming (MILP) strategies. We show that the policy identified by our approach is able to account for plant uncertainties in online decision-making, with expected performance comparable to existing MILP methods. Additionally, the framework gains the benefits of optimizing for risk-sensitive measures, and identifies decisions orders of magnitude faster than the most efficient optimization approaches. This promises to mitigate practical issues and to ease the handling of realizations of process uncertainty in the paradigm of online production scheduling.
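Since the preprint optimizes the conditional value-at-risk, a minimal empirical CVaR estimator is sketched below; the lognormal makespan samples stand in for whatever scenario costs a scheduling policy produces and are purely illustrative.

```python
import numpy as np

def cvar(costs, alpha=0.95):
    """Empirical CVaR: mean of the worst (1 - alpha) fraction of costs."""
    costs = np.sort(np.asarray(costs))
    cutoff = np.quantile(costs, alpha)        # empirical value-at-risk
    return costs[costs >= cutoff].mean()

# Stand-in for scenario costs of one candidate schedule (illustrative only).
rng = np.random.default_rng(1)
makespans = rng.lognormal(mean=3.0, sigma=0.25, size=10_000)
print(f"E[cost]    = {makespans.mean():.2f}")
print(f"CVaR(0.95) = {cvar(makespans, 0.95):.2f}")
```

Optimizing CVaR rather than the mean penalizes exactly the tail scenarios that make a schedule fail in practice, which is why risk-sensitive formulations matter in online scheduling.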
... The stochastic approach, which is frequently used to solve planning problems in an uncertain context [Mula et al., 2006] [Peidro et al., 2009], relies on stochastic programming [Birge and Louveaux, 1997], fuzzy programming [Zimmermann, 1991], robust optimization [Leung et al., 2006], and stochastic dynamic programming [Cristobal et al., 2009]. These four approaches are the most widely used when modeling uncertainty in production planning problems [Sahinidis, 2004]. ...
Thesis
In a context of intense international competition and strong growth in global demand for fertilizers, a demand characterized by a wide diversity of products, companies seeking to preserve their physical, economic, and financial equilibria against all odds find themselves largely exposed to numerous problems arising from customer requirements and the new competitive context reinforced by globalization. It is in this tense context that these companies, confronted with multiple challenges, urgently seek to adopt structural adjustment plans, the sooner to set out on the path of innovation and the optimization of their supply chains. It is indeed essential that these manufacturers focus on developing new flow-management approaches aimed at satisfying the final customer while minimizing costs, in order to ensure their durability in a perspective of rapid and sustainable development. It is in this context that this doctoral work focuses on defining decision-support tools that span different time horizons and different functions of the supply chain, namely the production, storage, and distribution of bulk products. The objective is to study the value of better coordination between these different planning decisions in order to meet market requirements. Beyond its practical interest, the thesis falls within the scope of operations research and, more precisely, draws on deterministic optimization and discrete-event flow simulation, as well as their coupling. In this sense, we have developed decision-support tools capable of meeting the manufacturer's needs at the operational, tactical, and strategic levels. The proposed methodology rests on three main axes. The first axis defines the modeling methodology adopted for planning industrial logistics systems, with a view to building a flow network. The second axis addresses the construction of production, storage, and distribution plans serving highly diversified markets (for example, fertilizer markets); we propose three optimization models, one for each planning level, minimizing total costs and maximizing the demand satisfaction rate. The third axis presents a discrete-event simulation model to understand and analyze the dynamics of the studied system and thus evaluate the different plans provided by the optimization models.
... A further complication arises whenever uncertainty is involved in the decision-making process. The problems of optimisation under uncertainty are characterised by the necessity of making decisions without knowing what their full effects will be (Sahinidis 2004). The main paradigms dealing with uncertain data in problem-solving situations are stochastic programming and robust optimisation. ...
Article
Full-text available
In this paper, we focus on a location-routing problem (LRP) in the dairy industry. This problem consists of locating a cold storage warehouse, from which vehicles of limited capacity are dispatched to serve a given number of supermarkets with uncertain service requirements, and determining the order of supermarkets served by each vehicle. First, the LRP is solved by using a classical approach based on a deterministic model where the service requirements, i.e. customer demands, are defined through sample means. Second, we propose an indifference zone approach to the LRP. The indifference zone procedures are specific ranking and selection methods aimed at selecting the best option from a set of alternative configurations. In particular, they attempt to guarantee the probability of correct choice, while minimising the computational effort. The numerical results presented in the paper highlight the risk of biased decision making when mere sample means are used in a deterministic model. In addition, they show the effectiveness of indifference zone approaches to the dairy products distribution activity.
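The following toy sketch conveys the indifference-zone idea in the ranking-and-selection spirit of the paper: candidate configurations are simulated, and a configuration is eliminated only when its sample mean is worse than the incumbent's by more than the indifference parameter delta plus a standard-error allowance. The simulators, delta, and the elimination rule are illustrative choices of ours, not the specific procedure the authors apply.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical cost simulators for three candidate warehouse configurations.
simulate = [lambda n: rng.normal(100.0, 8.0, n),
            lambda n: rng.normal(97.0, 8.0, n),
            lambda n: rng.normal(98.0, 8.0, n)]

delta = 1.0                     # indifference zone: smaller gaps do not matter
samples = [list(sim(30)) for sim in simulate]     # first-stage samples
alive = set(range(len(simulate)))

for _ in range(1000):           # cap on additional sampling rounds
    means = {i: np.mean(samples[i]) for i in alive}
    best = min(means, key=means.get)
    for i in list(alive - {best}):
        # Eliminate i when it is worse than the incumbent by more than
        # delta plus a (rough) joint standard-error allowance.
        se = np.sqrt(np.var(samples[i], ddof=1) / len(samples[i])
                     + np.var(samples[best], ddof=1) / len(samples[best]))
        if means[i] - means[best] > delta + 2.0 * se:
            alive.remove(i)
    if len(alive) == 1:
        break
    for i in alive:
        samples[i].extend(simulate[i](10))        # keep sampling the survivors
print("selected configuration(s):", alive)
```

The point of the indifference zone is visible here: sampling effort is spent only until the surviving alternatives differ by less than an amount the decision-maker has declared irrelevant, instead of relying on a single batch of sample means.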
... According to Zhao et al. (2018), uncertainty can be modeled by several approaches. As a useful classification, Sahinidis (2004) categorizes and reviews the main optimization approaches under uncertainty in two groups: (1) stochastic optimization (recourse models, robust stochastic programming, probabilistic models) and (2) fuzzy optimization. In stochastic optimization, there are broadly three types of stochastic programming approaches: expected value ...
Article
Full-text available
Since 2016, hospitals in France have come together to form Territorial Hospital Groups (THGs) in order to modernize their health care system. The main challenge is to allow an efficient logistics organization to adopt the new collaborative structure of the supply chain. In our work, we approach the concept of logistics pooling as a form of collaboration between hospitals in THGs. The aim is to pool and rationalize the storage of products in warehouses and optimize their distribution to care units while reducing logistics costs (transportation, storage, workforce, etc.). Besides, due to the unavailability and incompleteness of data in real-world situations, several parameters embedded in supply chains could be imprecise or even uncertain. In this paper, a fuzzy chance-constrained programming approach is developed based on possibility theory to solve a network design problem in a multi-supplier, multi-warehouse, and multi-commodity supply chain. The problem is designed as a minimum-cost flow graph and a linear programming optimization model is developed considering fuzzy demand. The objective is to meet the customers’ demand and find the best allocation of products to warehouses. Different instances were generated based on realistic data from an existing territorial hospital group, and several tests were developed to reveal the benefits of collaboration and uncertainty handling.
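One standard possibility-based reduction, under the common assumption of a triangular fuzzy demand d̃ = (d₁, d₂, d₃) (pessimistic, most likely, optimistic); the paper's exact defuzzification may differ. Requiring that the supplied quantity y covers demand with possibility at least α,

```latex
\mathrm{Pos}\{\tilde{d} \le y\} \;\ge\; \alpha
\quad\Longleftrightarrow\quad
y \;\ge\; d_1 + \alpha\,(d_2 - d_1),
```

turns the fuzzy chance constraint into an ordinary linear constraint, which is what keeps the network design model solvable as a linear program.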
... A large number of modelling philosophies, such as expectation minimization, minimization of deviations from goals, minimization of maximum costs, and the inclusion of probabilistic constraints, have influenced optimisation approaches under uncertainty [6]. In an optimisation process, the main techniques that take uncertainties into account are stochastic programming [7], fuzzy programming [8], reliability-based design optimisation [9] and uncertainty quantification [10,2,11]. ...
Preprint
The supply chain network is critical to serving customers, so the most common practices are to determine the number, location, and capacity of facilities. At the same time, uncertainties and risks must be taken into account in order to control delays. In this context, many optimisation models have been developed whose results are used in transportation networks and thereby improve supply chain performance. Models have been developed for both routing and zoning/districting problems, and different cases have been discussed in the literature, such as facility location problems, urban problems, and transportation problems. This paper seeks to review the literature in this area and decompose the models into mathematical modelling and geometrical approaches. Distribution is an important part of supply chain management; it is a process with multiple participants, a characteristic that brings a high level of uncertainty. This article therefore presents the distribution process and in particular the design of the transportation network, which can include both routing and districting problems.
... While measurement uncertainty is an almost automatic step in the presentation of experimental results, the quantification of uncertainty (UQ) is far less obvious for numerical methods and turbulence modeling in particular [194,276], and the concept has actually only appeared at the beginning of this century. "UQ is a way to understand and to quantify the reliability of analysis predictions". ...
Thesis
Full-text available
The models used to calculate small-caliber projectile trajectories are often only drag-based, given the presumed short ranges and the assumed small variation of the aerodynamic parameters in flight. Depending on the type of application, "field" calibrations are then performed to compensate for the observed deviations. However, with the new small-caliber applications and the inherent increased challenges, these simplified methods do not yield satisfactory results anymore in terms of accuracy and attitude upon impact. In the first part, in addition to reviewing existing trajectography models, we discuss their implementation in our own trajectory program VTraj, developed in LabVIEW. The six degrees of freedom (6-DoF) model makes it possible to compute the flight of any symmetrical or asymmetrical projectile (spin- or fin-stabilized). Its parameters include a complete set of static and dynamic contributions, including Magnus and pitch damping forces & moments. This model allows the analysis of all translational and angular motions of the projectile's body. The models give good agreement with published results on standard reference projectiles for the trajectory parameters. In part two, we focus on the methodology to capture the static and dynamic aerodynamic coefficients by steady and unsteady RANS methods for subsonic, transonic and supersonic flight conditions. Accurate resolution of the flow in the boundary layer and in the wake of the projectile proved to be of utmost importance for the correct determination of the coefficients. The coefficient extraction methods are assessed with published results for canonical shapes and good agreement is achieved. The results highlight the strong dependency of the pitch damping coefficient on the reduced pitch frequency, which varies along the flight path. Rigid Body Dynamics (RBD) and Computational Fluid Dynamics (CFD) are finally combined in order to evaluate the behavior of specific small-caliber applications: non-lethal projectiles operating in the low subsonic domain, long-range projectiles with focus on transonic domain crossing, and asymmetric configurations are studied. The resolution of the dynamic flow around the projectile and the prediction of stability upon impact are confronted with experimental results and the match is very promising. The research also gives new insight into the diverse phenomena at hand in the transonic domain, or for projectiles with mass unbalance, and the change they impart on static and dynamic stability characteristics.
... Literature review/related work: Chance constraints constitute a means to provide certain guarantees on constraint satisfaction under uncertainty [7] and have found applications in several domains including in power system optimization under uncertainty. The authors in [8] provide a chance constraint formulation for distribution systems with renewable energy resources to enforce voltage regulation with prescribed probability and employ convex approximations of the chance constraints to obtain a computationally tractable formulation. ...
Preprint
Full-text available
Critical energy infrastructure is constantly under stress due to the ever-increasing disruptions caused by wildfires, hurricanes, other weather-related extreme events, and cyber-attacks. Hence, it becomes important to make critical infrastructure resilient to threats from such cyber-physical events. However, such events are hard to predict and numerous in nature and type, and it is infeasible to make a system resilient to every possible cyber-physical event; such an approach can make the system operation overly conservative and impractical. Furthermore, the distributions of such events are hard to predict, and the available historical data on such events can be very sparse, making the problem even harder to solve. To deal with these issues, in this paper we present a policy-mode framework that enumerates and predicts the probability of various cyber-physical events, and then a distributionally robust optimization (DRO) formulation that is robust to the sparsity of the available historical data. The proposed algorithm is illustrated on an islanded microgrid example: a modified IEEE 123-node feeder with distributed energy resources (DERs) and energy storage. Simulations are carried out to validate the resiliency metrics under the sampled disruption events.
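In generic form, the distributionally robust formulation referenced here hedges against every distribution in an ambiguity set built around the sparse historical record (standard Wasserstein-ball notation; the preprint's exact ambiguity set is not specified here):

```latex
\min_{x \in X}\; \sup_{\mathbb{P} \in \mathcal{P}}\; \mathbb{E}_{\mathbb{P}}\big[\, f(x,\xi) \,\big],
\qquad
\mathcal{P} = \big\{\, \mathbb{P} \;:\; W\!\big(\mathbb{P},\, \hat{\mathbb{P}}_N\big) \le \rho \,\big\},
```

where P̂_N is the empirical distribution of the N observed disruption events, W is a distance between distributions (e.g., the Wasserstein metric), and the radius ρ is chosen larger when the data are sparser, trading conservativeness for protection against sampling error.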
... Optimisation problems under uncertainty are dealt with in the open literature using a wide range of mathematical techniques, including stochastic programming (Sahinidis, 2004), chance-constraint programming (Li et al., 2008), and robust optimisation (Ben-Tal and Nemirovski, 1999; Bertsimas and ...
Article
Full-text available
Optimisation under uncertainty has always been a focal point within the Process Systems Engineering (PSE) research agenda. In particular, the efficient manipulation of large amounts of data for the uncertain parameters constitutes a crucial condition for effectively tackling stochastic programming problems. In this context, this work proposes a new data-driven Mixed-Integer Linear Programming (MILP) model for the Distribution & Moment Matching Problem (DMP). For cases with multiple uncertain parameters, a copula-based simulation of initial scenarios is employed as a preliminary step. Moreover, the integration of clustering methods and DMP in the proposed model is shown to enhance computational performance. Finally, we compare the proposed approach with state-of-the-art scenario generation methodologies. Through a number of case studies we highlight the benefits regarding the quality of the generated scenario trees by evaluating the corresponding obtained stochastic solutions.
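To make the moment-matching idea concrete, the sketch below solves a simplified linear-programming analogue with scipy: probabilities are assigned to candidate scenarios so that the first three moments match prescribed targets, and the sparsity of LP vertex solutions naturally yields a small scenario set. The paper's MILP additionally fixes the number of scenarios and matches distribution statistics; this toy version is ours.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
s = rng.normal(size=200)                  # candidate scenario values
targets = np.array([0.0, 1.0, 0.0])       # target E[x], E[x^2], E[x^3]
M = np.vstack([s, s**2, s**3])            # want M @ p close to targets

k, n = M.shape
# Variables: probabilities p (length n) and moment slacks e (length k).
# Minimize sum(e)  s.t.  |M p - targets| <= e,  sum(p) = 1,  p, e >= 0.
c = np.concatenate([np.zeros(n), np.ones(k)])
A_ub = np.vstack([np.hstack([M, -np.eye(k)]),
                  np.hstack([-M, -np.eye(k)])])
b_ub = np.concatenate([targets, -targets])
A_eq = np.hstack([np.ones((1, n)), np.zeros((1, k))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + k))
p = res.x[:n]
kept = p > 1e-9                           # LP vertex solutions are sparse
print(f"{kept.sum()} scenarios kept; matched moments: {M @ p}")
```

The few scenarios that receive positive probability form the reduced scenario tree fed to the stochastic program, which is the computational payoff of matching moments instead of using all raw samples.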
... Impreciseness, lack of knowledge, ambiguousness, and the uncertain nature of some input data have introduced a large amount of uncertainty into the problem. ... Besides identifying various sources of uncertainty, many research studies have attended to different approaches dealing with uncertainty (see Stefansdottir and Grunow, 2018; Gholamian et al., 2015; Naderi et al., 2016). In this regard, some popular approaches, namely stochastic programming, fuzzy programming, and robust optimization, have been introduced to overcome randomness, epistemic and deep uncertainties, respectively (Sahinidis, 2004). In this regard, Kazemi Zanjani et al. (2010) presented a multi-stage stochastic programming approach to handle the uncertainty related to the quality of raw materials and customers' demand. ...
Preprint
Full-text available
This paper studies the effect of the open innovation concept on the product design process and supply chain master planning. Complex uncertainties caused by using outbound resources within co-design processes, financial challenges between the collaborating parties, and the integration of outbound innovative designs with the tactical supply chain planning problem are taken into account. To this end, a robust fuzzy-stochastic optimization model is proposed which can integrate product design with the main activities of a multi-product supply chain. The proposed model is able to cope with different types of uncertainties, including random, epistemic and deep uncertainties. Integrating the financial and physical flows, using a novel pricing mechanism, and considering the outside-in innovations in the product design process are the outstanding contributions of the proposed model. Furthermore, to cover both the short-term and long-term success criteria, the ambidexterity of the studied supply chain is taken into account via two conflicting explorative and exploitive objectives. Results indicate the superiority of the presented model and its ability to support managerial decisions in the mid-term planning process.
Chapter
Life cycle assessment (LCA) and technoeconomic analysis (TEA) are essential tools for evaluating biorefinery performance and designing cost-effective and environmentally friendly supply chains. However, biorefinery operations often suffer from significant temporal and spatial uncertainties, including raw material supply and product demands. This work uses stochastic programming and multi-period planning to design a cost-efficient modular biorefinery supply chain under uncertain demand and material supply. Next, the proposed model is used to design and evaluate modular biorefinery performance in the Baltimore-Wilmington-Philadelphia region. Moreover, the optimization result illustrates the seasonal variability of biomass-based product emissions due to demand/supply uncertainty, which cannot be captured by conventional LCA uncertainty analysis.
Article
This paper addresses the problem of designing shell and tube heat exchangers operating under uncertain conditions. Traditional design methods assume constant process conditions and known values of thermal design factors; thus, they may fail to meet specifications when operational conditions change. Use of safety factors or worst-case design often leads to conservative design and equipment oversizing. No guideline is available in the literature for choosing a design approach. In order to give designers better awareness of this issue, we investigate when uncertainty should be taken into account, what the advantages and drawbacks of available approaches for design under uncertainty are, and which method is preferable in specific problem instances. To answer the above research questions, comparative sizing of equipment is carried out under consistent assumptions adopting five alternative design methods; namely, design for a nominal reference condition with verification of off-design performances, worst-case design, use of safety margins, design to optimize a prescribed objective function, and robust design. Both random uncertainty in external process conditions and epistemic uncertainty in heat transfer coefficients and fouling resistances are considered. Different types of specifications (more/less is better or nominal is better) are accounted for. The probability of meeting specifications and the resulting geometrical characteristics are compared. This is the first time that such a comparison has been carried out, and the discussion of results identifies the merits and limitations of the considered approaches. It is found that no single approach is superior, as any method can be suited to one design situation but perform poorly in other cases. The observed behaviour is explained in order to guide designers in choosing the preferable design method for a specific instance, resulting in a more cost-effective equipment configuration with a higher probability of meeting specifications.
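The trade-off the paper quantifies can be illustrated with a few lines of Monte Carlo: size the exchanger at nominal conditions, then check the probability of meeting the duty specification as a safety margin on the area is applied. The Q = U·A·ΔT model and the distributions below are deliberately crude placeholders, not the paper's design cases.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
# Assumed (illustrative) distributions: the overall heat transfer coefficient
# U carries epistemic scatter; the effective temperature difference dT varies
# with the process.
U = rng.normal(800.0, 80.0, N)        # W/(m^2 K)
dT = rng.normal(40.0, 4.0, N)         # K
Q_req = 1.2e6                         # required duty, W

A_nominal = Q_req / (800.0 * 40.0)    # area sized at nominal conditions

def p_meets_spec(area):
    return np.mean(U * area * dT >= Q_req)

for margin in (1.0, 1.1, 1.25, 1.5):  # safety factor applied to the area
    print(f"margin {margin:.2f}: "
          f"P(meet duty) = {p_meets_spec(A_nominal * margin):.3f}")
```

A design sized at nominal conditions meets the duty only about half the time under symmetric scatter, while each increment of margin buys reliability at the cost of oversizing, which is precisely the comparison the paper carries out across the five design methods.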
Article
We present a perspective for accelerating biomass manufacturing via digitalization. We summarize the challenges for manufacturing and identify areas where digitalization can help. A profound potential in using lignocellulosic biomass and renewable feedstocks, in general, is to produce new molecules and products with unmatched properties that have no analog in traditional refineries. Discovering such performance-advantaged molecules and the paths and processes to make them rapidly and systematically can transform manufacturing practices. We discuss retrosynthetic approaches, text mining, natural language processing, and modern machine learning methods to enable digitalization. Laboratory and multiscale computation automation via active learning are crucial to complement existing literature and expedite discovery and valuable data collection without a human in the loop. Such data can help process simulation and optimization select the most promising processes and molecules according to economic, environmental, and societal metrics. We propose the close integration between bench and process scale models and data to exploit the low dimensionality of the data and transform the manufacturing for renewable feedstocks.
Article
This paper concentrates on the formulation of a large-scale nonconvex mixed-integer nonlinear programming model and the application of robust optimization for the multi-period operational planning of a real-world integrated refinery-petrochemical site in China under uncertain product demands and crude oil price. To avoid the excessive conservativeness resulting from classical static robust optimization, an adjustable robust counterpart incorporating recourse decisions via an affinely adjustable linear decision rule is first derived. On the basis of a proposed polyhedral dynamic uncertainty set that mimics the dynamic behavior of the product demand over time, an adjustable robust counterpart with a dynamic uncertainty set is further formulated. Classical static robust optimization, adjustable robust optimization, and adjustable robust optimization with dynamic uncertainty sets are systematically compared in case studies. The results clearly illustrate the advantages of affinely adjustable robust optimization with a dynamic uncertainty set over classical robust optimization in decision making.
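The affinely adjustable decision rule mentioned here restricts recourse decisions to affine functions of the uncertainty observed so far (generic textbook form; the paper's site-specific model is far larger):

```latex
y_t(\xi) \;=\; y_t^{0} + \sum_{s \le t} Y_{ts}\,\xi_s,
\qquad
\min_{x,\,y^{0},\,Y}\;\max_{\xi \in \mathcal{U}}\;
\Big\{\, c^{\top}x + q^{\top}y(\xi)
\;:\; A x + B\,y(\xi) \le b(\xi)\ \ \forall \xi \in \mathcal{U} \,\Big\}.
```

Restricting y to this form keeps the robust counterpart tractable while letting later-period decisions react to realized demand; the dynamic uncertainty set U additionally couples the ξ_s across periods so that the protected demand trajectories mimic realistic demand dynamics.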
Article
The oil sector is one of those that may most benefit from improved efficiency of supply chain management. However, the dynamic behavior of such chains is too complex to be tackled by traditional approaches. Moreover, these chains show several intrinsic characteristics in common with multi-agent systems, which offer the required flexibility to model the complexities and dynamics of real supply chains without overly simplifying assumptions. Since the problem of managing the supply chain has a recursive structure, it becomes more convenient to use a holonic agent-based model, which shows a fractal-type structure. Furthermore, the type of relationship between entities in the chain and the need for global optimization suggest modeling their interactions in the form of a constraint network. For this reason, this work defines a new optimization problem called the Holonic Constraint Optimization Problem (HCOP), which is based on concepts from the Distributed Constraint Optimization Problem (DCOP) and holonic agents. In addition, we developed a meta-algorithm based on the DPOP algorithm for solving this type of problem, using the FRODO framework in an environment where available centralized optimization algorithms are integrated so as to obtain the optimization. Finally, experiments were performed on a case study of the PETROBRAS company, where a typical supply chain of the petroleum industry was modeled as an HCOP. Those experiments integrated the optimization systems for production and logistics, which are representative of actual situations, and allowed verification of the feasibility of this model and its comparison with conventional approaches.
Article
Full-text available
Energy demand is increasingly the most relevant cost item in chemical plants. Operating expenses indeed play a main role in all plants processing large amounts of feedstock via well-established processes in the petrochemical industry. In staged operations, the optimal number of stages is usually obtained by means of an economic optimization. However, the designed equipment, external duties, and thus operating expenses may considerably vary under the effect of external disturbances. The main purpose of this paper is to outline a simple but effective procedure to account for perturbations in the assessment of the optimal number of stages. The analysis shows that appropriate investments could lead to a unit design able to mitigate the higher duty requirements when external perturbations occur. The results highlight that the optimal number of stages varies when uncertainty is considered and, with low computational effort, this can be effectively quantified by means of the applied methodology. Furthermore, the same approach has been applied to sustainability indicators over the uncertain domain as well. In those cases, when more stages correspond to more flexible equipment, the environmental impact is positively affected, and a double benefit can be observed.
Article
In this paper, we attempt to provide a perspective on the field of process systems engineering (PSE) by tracing its evolution back to the pioneering work of Neal Amundson, Rutherford Aris, and Roger Sargent, and highlighting their legacies that continue to guide research in PSE to this day. We underscore the growth in the technical scope of the field from the adoption of control and optimization methods for analyzing chemical engineering problems to the advancement of the methods themselves. We comment on the extended scope, current state, and major trends in PSE while reflecting on the intellectual identity of our field that has developed over time. PSE is positioned to play a crucial role in addressing major societal challenges, such as sustainability and health. While considering a diverse set of applications, it will also be critical to further advance theory which allows us to address the complexity that underlies these problems. Finally, we outline a few emerging research themes that could serve as initial food for thought in a broader discussion on future research directions in PSE.
Article
Centralized wastewater treatment was the favored strategy until a few decades ago, in order to exploit every possible economy of scale. Nowadays, water stress and resource scarcity, driven by population growth and climate change, call for water reuse and resource recovery, and centralization is often not the best solution for these goals. Today, the reuse of reclaimed water can take place at different levels and represents an option of primary importance; in some cases, therefore, centralized systems may be economically and environmentally unsustainable for this additional purpose, and the search for the optimal degree of infrastructure centralization must take these goals into account. This review analyzes studies that have investigated the best centralization level of wastewater collection and treatment, focusing on the methodologies applied to make the decision, highlighting the strengths and weaknesses of the different approaches, and tracing how they have evolved over time. The final goal is to guide planners and decision-makers in choosing and applying the most suitable method to assess the centralization level of wastewater infrastructure, based on the objectives set out. The reviewed studies cover a period of twenty years. The differences found along this time span show an ongoing paradigm shift towards hybrid systems, which combine centralized and decentralized wastewater treatment and promote the storage of treated water and various forms of local water reuse and resource recovery. The protection of human health and the environment (which primarily motivates water reuse and resource recovery) has become the main challenge for wastewater treatment systems, which will presumably further improve their economic, social and environmental sustainability to achieve urban development in the context of the water-energy-food security nexus.
Article
The operational optimization of refining processes faces complex coupled states and frequently changing conditions. In particular, the feed of fluid catalytic cracking (FCC) units exhibits property fluctuations that introduce uncertainty in profit and can render the schemes produced by deterministic optimization models suboptimal. This study designed a bilevel data-driven robust optimization framework that optimizes the feed selection and reaction temperature of an industrial FCC unit under feed property uncertainty. Two uncertainty sets based on the feed property data were derived from two years of historical industrial data and from simulation data. Because most chemical reaction models are differential equations, a bilevel programming framework implemented in Julia was the key to solving the nested numerical and mathematical problems. A real-world case study demonstrates the effectiveness of the proposed approach in protecting against uncertainties to ensure profits for FCC units.
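The bilevel framework itself is not reproduced in the abstract, but its basic ingredient, a data-driven uncertainty set, is easy to sketch. Assuming, for illustration, a box set built from historical quantiles and a hypothetical profit model, a candidate operating point can be scored by its worst case over the set:

    import itertools
    import numpy as np

    def box_uncertainty_set(history, lo_q=0.05, hi_q=0.95):
        """Box uncertainty set from componentwise quantiles of (n x d) data."""
        return np.quantile(history, lo_q, axis=0), np.quantile(history, hi_q, axis=0)

    def worst_case_profit(decision, lo, hi, profit_model):
        # Evaluate every corner of the box and keep the worst value; this is
        # exact when profit is monotone in each property, a heuristic otherwise.
        corners = itertools.product(*zip(lo, hi))
        return min(profit_model(decision, np.array(c)) for c in corners)

    # Hypothetical usage: pick the reaction temperature with the best worst case.
    rng = np.random.default_rng(1)
    history = rng.normal([0.86, 12.0], [0.02, 1.5], size=(700, 2))  # two feed properties
    lo, hi = box_uncertainty_set(history)
    profit = lambda T, prop: -(T - 510.0) ** 2 / 50.0 + 10.0 * prop[0] - 0.1 * prop[1]
    best_T = max(np.linspace(480.0, 540.0, 61),
                 key=lambda T: worst_case_profit(T, lo, hi, profit))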
Conference Paper
Data is generated at an exponential pace with advances in information technology, and such data often contain uncertain and vague information. Data engineering deals with methodologies to assess and evaluate uncertainties in a dataset and to generate useful information from the data pool. This work presents a mathematical approach to evaluate a dataset's uncertainties and its application to data reduction. The proposed method is used for attribute selection in the early prediction of diabetes. Experimental results show that the prediction accuracy using the rough set method is higher than that of the other methods.
Chapter
The efficient exploitation of large amounts of data for the uncertain parameters is a crucial condition for effectively handling stochastic programming problems. In this work we propose a novel data-driven mixed-integer linear programming (MILP) model for the Distribution Matching Problem (DMP). In cases of multiple uncertain parameters, sampling using copulas is conducted as a preliminary step. The integration of clustering methods and the DMP in the proposed model is shown to improve computational efficiency. The performance of the proposed scenario generation approaches is evaluated on several case studies of a two-stage stochastic programming problem. Compared with state-of-the-art scenario generation (SG) approaches, the proposed model consistently achieves the lowest errors in expected values relative to full-space stochastic solutions, while preserving good accuracy in the probabilistic and statistical qualities of the reduced generated sets.
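As an illustration of the clustering step that the chapter integrates with distribution matching (the MILP-based DMP itself is not reproduced here), a plain k-means reduction maps a large scenario set to a few representatives with probabilities proportional to cluster sizes:

    import numpy as np

    def reduce_scenarios(scenarios, k, iters=50, seed=0):
        """Reduce an (N x d) scenario set to k weighted representatives
        (plain k-means; assumes no cluster empties out along the way)."""
        rng = np.random.default_rng(seed)
        centers = scenarios[rng.choice(len(scenarios), size=k, replace=False)]
        for _ in range(iters):
            dist = np.linalg.norm(scenarios[:, None, :] - centers[None, :, :], axis=2)
            labels = dist.argmin(axis=1)
            centers = np.stack([scenarios[labels == j].mean(axis=0) for j in range(k)])
        probs = np.bincount(labels, minlength=k) / len(scenarios)
        return centers, probs

    samples = np.random.default_rng(2).lognormal(0.0, 0.4, size=(5000, 3))
    reduced, probs = reduce_scenarios(samples, k=10)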
Article
We present a specialized scenario generation method that utilizes forecast information to generate scenarios for day-ahead scheduling problems. In particular, we use normalizing flows to generate wind power scenarios by sampling from a conditional distribution that uses wind speed forecasts to tailor the scenarios to a specific day. We apply the generated scenarios in a stochastic day-ahead bidding problem of a wind electricity producer and analyze whether the scenarios yield profitable decisions. Compared to Gaussian copulas and Wasserstein generative adversarial networks, the normalizing flow successfully narrows the range of scenarios around the daily trends while maintaining a diverse variety of possible realizations. In the stochastic day-ahead bidding problem, the conditional scenarios from all methods lead to significantly more stable, profitable results compared to an unconditional selection of historical scenarios. The normalizing flow consistently obtains the highest profits, even for small scenario sets.
Article
This paper presents a multi-objective optimization strategy for monetizing shale/natural gas into two important products: ammonia and methanol. The study considers the uncertainty associated with the composition of the feed shale/natural gas stream. The decision variables are the feed flow and the relative allocation of the feedstock to each process. Economic and environmental objectives are used in the optimization. The economic objective is to maximize the net profit while considering revenue and production costs. The environmental objective is to minimize the total annual CO2 emissions associated with providing the process with the required heat and power for the reformer and the compressors, respectively. Process simulation is carried out using Aspen HYSYS. The improved multi-objective differential evolution algorithm (I-MODE) is used for the optimization strategy. A random generator routine is developed for the creation of uncertain parameter values, and the programs are linked through Visual Basic for Applications as a medium to exchange data. After applying the optimization strategy, the results show an improvement in the objective functions, and the decision-maker can choose a point on the Pareto front according to the desired trade-off among the targeted objectives. The results offer alternatives that increase the net profit by up to 26.8% with an increase in emissions of only 4.5%.
Article
Full-text available
Many real-world design problems involve optimization of expensive black-box functions. Bayesian optimization (BO) is a promising approach for solving such challenging problems, using probabilistic surrogate models to systematically trade off between exploitation and exploration of the design space. Although BO is often applied to unconstrained problems, it has recently been extended to the constrained setting. Current constrained BO methods, however, cannot identify solutions that are robust to unavoidable uncertainties. In this paper, we propose a robust constrained BO method, CARBO, that addresses this challenge by jointly modeling the effect of the design variables and uncertainties on the unknown functions. Using exact penalty functions, we establish a bound on the number of CARBO iterations required to find a near-global robust solution and provide a rigorous proof of convergence. The advantages of CARBO are demonstrated on two case studies: a non-convex benchmark problem and a realistic bubble column reactor design problem.
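A bare-bones illustration of the BO loop that such methods build on (unconstrained and non-robust, with a fixed-hyperparameter Gaussian process and expected improvement on a grid; CARBO's joint modeling of uncertainties and constraints is not attempted here):

    import numpy as np
    from scipy.stats import norm

    def rbf(a, b, ls=0.2):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

    def gp_posterior(X, y, Xs, noise=1e-6):
        K = rbf(X, X) + noise * np.eye(len(X))
        Ks = rbf(X, Xs)
        sol = np.linalg.solve(K, Ks)
        mu = sol.T @ y
        sd = np.sqrt(np.clip(np.diag(rbf(Xs, Xs) - Ks.T @ sol), 1e-12, None))
        return mu, sd

    f = lambda x: np.sin(3.0 * x) + 0.5 * x      # hypothetical expensive black box
    X = np.array([0.1, 0.9]); y = f(X)
    grid = np.linspace(0.0, 1.0, 201)

    for _ in range(10):
        mu, sd = gp_posterior(X, y, grid)
        z = (y.min() - mu) / sd
        ei = (y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
        x_next = grid[ei.argmax()]
        X = np.append(X, x_next); y = np.append(y, f(x_next))

    print("best design found:", X[y.argmin()])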
Article
This paper presents a multi-objective optimization approach for the security assessment of the water–energy–food nexus considering the uncertainty involved. The proposed approach consists of a hybrid strategy that combines deterministic and metaheuristic optimization for solving this complex problem. The deterministic part of the mathematical model is developed in the General Algebraic Modeling System (GAMS) platform, and the metaheuristic part is approached using the improved multi-objective differential evolution algorithm programmed in Visual Basic for Applications. The communication between the different software tools is implemented using GAMS data exchange files and linking routines. The uncertainty associated with the problem is considered using a code that generates random values for the uncertain parameters of the mathematical model. The optimization approach is applied to assess the water–energy–food nexus in an arid region; three objective functions (one economic and two environmental) are incorporated in the proposed formulation, and a case study from a region of Mexico is addressed to show its applicability. The results of the optimization process offer alternatives that reconcile economic and environmental interests while considering uncertainty in weather conditions such as rainfall and solar radiation. The obtained results show a set of candidate solutions offering different increases in the satisfaction of energy and water demand in different sectors with the smallest possible increase in the economic and environmental objective functions. In addition, the proposed methodology is replicable to different case studies where approaching the optimal solution becomes a complicated task.
Article
Full-text available
Despite great progress in improvement methodologies, decision-making in supply chain management has become more challenging, especially because various sources of model uncertainty must be accounted for to ensure the quality, or even the practical feasibility, of a solution. One of the most pressing problems today is therefore incorporating variability in process parameters such as manufacturing time and reaction conditions. In this paper, several interactive methods are summarized that modify the plan obtained from the nominal version of the system to reflect modified or updated system data. Finally, the methods for dealing with such problems are divided into two main approaches: the reactive approach and the preventive approach.
Thesis
Short-term planning of the joint production of a hydroelectric chain and variable generation assets is an optimization problem. The hydroelectric production schedule is built over a horizon of a few days so as to respect operating constraints and to maximize the revenue obtained from sales on the day-ahead market net of imbalance penalties, considering the total production of all assets. The work of this thesis consists in taking into account, within the optimization problem, the uncertainties affecting the forecasts of water inflows, electricity prices and variable generation. Starting from a representation of the uncertainties by a finite set of multivariate scenarios, the proposed probabilistic optimization problem is a stochastic dynamic programming problem whose two levels are mixed-integer linear programs. Since computation time is crucial for operational use, the probabilistic optimization problem is solved numerically by replacing the optimal value of the second level with a metamodel calibrated by supervised learning during a pre-processing phase. To calibrate this metamodel, the input data are simplified by specific approaches. Sampling by analogy first approximates the domain of the input data associated with the first-level decision variables. The functional input data are then reduced by principal component analysis. A design of experiments can thus be built by Latin hypercube sampling to form the dataset needed to train the metamodel. Several linear metamodels are then proposed. The proposed methodology is tested on a simplified real case study. The metamodel obtained by linear regression on the reduced input data gives acceptable performance and yields a solution of the probabilistic optimization problem. Nevertheless, in the absence of variable generation in the case study, the probabilistic optimization problem does not provide significant gains over its deterministic counterpart. Moreover, the computation times do not allow operational use without distributed computing. Several research directions are nonetheless proposed to improve the methodology. Validation on several case studies, or even over a rolling horizon, would allow the performance of the methodology to be estimated more robustly.
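The surrogate pipeline described, a Latin hypercube design, dimension reduction of functional inputs by principal component analysis, and a linear metamodel of the second-level optimal value, can be sketched as follows (dimensions and the second-level function are illustrative placeholders, not the thesis's case study):

    import numpy as np

    rng = np.random.default_rng(0)

    def latin_hypercube(n, d):
        # One stratified draw per interval and dimension, independently permuted.
        cells = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
        return (cells + rng.random((n, d))) / n

    def second_level_value(x):
        # Placeholder for the expensive second-level optimal value.
        return 3.0 * x[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(len(x))

    X = latin_hypercube(200, 30)        # 30 coefficients describing functional inputs
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # principal directions
    Z = Xc @ Vt[:8].T                   # keep the 8 leading components
    y = second_level_value(X)

    A = np.hstack([np.ones((len(Z), 1)), Z])  # linear metamodel by least squares
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)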
Article
Climatic risk consists of financial and environmental risks, predominantly arising from carbon dioxide emissions from coal-fired power plants. We develop a climatic risk control model to investigate the coexistence of renewable energy and post-combustion carbon capture technology as CO2 reduction strategies. Additionally, our proposed framework explicitly considers the acceptance of such strategies based on balancing electricity supply against demand. This paper adopts an additive fuzzy mixed-integer optimization approach to model uncertain parameters and determines the optimal solution focusing on business and the environment. Furthermore, we investigate the feasibility of such emission control strategies with a scenario analysis that helps execute the country's emission reduction policies. The usefulness of our methodology is demonstrated using data from the coal-based power sector in the eastern part of India. Overall, the proposed model achieves a 30% reduction in emission release and provides strategies to the decision-maker for investment towards sustainable development.
Article
In this study, we present a multi-product, multi-period inventory control problem under uncertainty in product demands that emerges in the fashion industry. A two-stage stochastic model is proposed to design a planning strategy where the total cost incurred by purchase orders, inventory and shortage is minimised. We incorporate the Conditional Value at Risk (CVaR) within the formulation to address exogenous uncertainty. An industrial case study involving a Mexican fashion retail company was considered to assess the performance of the two-stage stochastic model. Scenarios were considered using historical data provided by the company. A sensitivity analysis was also conducted on risk-aversion parameters to assess how the values of these parameters affect the behaviour of the proposed formulation. The results show that the proposed two-stage stochastic formulation is an efficient and practical approach to handle exogenous uncertainty in industrial-scale capacitated lot-sizing problems.
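For reference, the Conditional Value at Risk term used in such formulations is commonly written in the Rockafellar-Uryasev form (a standard result, not specific to this paper):

    \mathrm{CVaR}_{\alpha}(Z) \;=\; \min_{\eta \in \mathbb{R}} \left\{ \eta + \frac{1}{1-\alpha}\, \mathbb{E}\big[(Z - \eta)^{+}\big] \right\}

Once the expectation is replaced by an average over scenarios, the term is linear in eta and in per-scenario shortfall variables, so it embeds directly into a two-stage stochastic LP or MIP.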
Article
Two-stage stochastic scheduling problems often involve a large number of continuous and discrete variables, so finding solutions in short time periods is challenging and computationally expensive. However, for online or closed-loop scheduling implementations, optimal or near-optimal solutions are required in real-time. We propose a decomposition method based on the so-called Similarity Index (SI). An iterative procedure is set up so that each sub-problem (corresponding to a scenario) is solved independently, aiming to optimize the original cost function while maximizing the similarity of the first-stage variables among the scenario solutions. The SI is incorporated into each subproblem cost function, multiplied by a penalty parameter that is updated in each iteration until complete similarity in the first-stage variables is reached among all subproblems. The method is applied to schedule production and maintenance tasks in an evaporation network. The tests show that significant benefits are expected in terms of computational demands as the number of scenarios increases.
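The abstract does not spell out the Similarity Index itself, but the penalty mechanism can be illustrated on a toy problem: each scenario subproblem is solved independently with a penalty pulling its first-stage decision toward the current consensus, and the penalty weight grows until the first-stage decisions coincide.

    import numpy as np

    # Two scenario subproblems with different costs over a scalar first-stage x.
    costs = [lambda x: (x - 1.0) ** 2, lambda x: 2.0 * (x - 4.0) ** 2]
    grid = np.linspace(-5.0, 10.0, 3001)

    x = np.zeros(2)                # per-scenario first-stage decisions
    rho = 0.1                      # penalty weight on dissimilarity
    for _ in range(60):
        target = x.mean()          # consensus value of the first-stage variable
        x = np.array([grid[np.argmin(c(grid) + rho * (grid - target) ** 2)]
                      for c in costs])
        if np.ptp(x) < 1e-3:       # first-stage decisions agree across scenarios
            break
        rho *= 1.5                 # tighten the penalty each iteration
    print("consensus first-stage decision:", x.mean())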
Article
This paper addresses the multi-objective optimization of water management used in the extraction of shale gas through the hydraulic fracturing process. The optimization aims to minimize the total annual cost and simultaneously minimize the total water requirements. Two uncertain factors are considered: the available fresh water and the flowrate of wastewater (flowback water). Due to the complexity of the problem, a hybrid optimization method is proposed by combining deterministic and metaheuristic optimization approaches. The deterministic model is programmed in the General Algebraic Modeling System (GAMS) commercial software, and the metaheuristic algorithm is implemented in Microsoft Excel®. Uncertainty is handled by a stochastic routine implemented in the Visual Basic for Applications environment, and communication among the programs is achieved by further routines also written in Visual Basic for Applications. To illustrate the merits of the proposed approach, a case study is solved addressing water management for a shale gas field in which a variable number of wells, from 20 to 60, with different schedules is considered. The results give the optimal operation scheduling of the wells for the minimum total annual cost and total freshwater consumption. Additionally, the maximum number of wells up to which feasible solutions exist is obtained. The best results were found with a maximum of 51 wells when the economic objective is prioritized and 59 wells when the environmental objective is prioritized.
Article
Parcel logistics services play a vital and growing role in economies worldwide, with customers demanding faster delivery of nearly everything to their homes. To move larger volumes more cost effectively, express carriers use sort technologies to consolidate parcels that share similar geographic and service characteristics for reduced per-unit handling and transportation costs. This paper focuses on an operational planning problem that arises in two-stage sort systems operating within parcel transportation networks. In this context, primary sorters perform an initial grouping of parcels into “piles” that are subsequently dispatched when necessary to secondary sorters; there, each pile’s parcels are fine-sorted based on their final destinations and service class for packing into outbound transportation vehicles. Such systems must be designed to handle a high degree of uncertainty in the quantity and timing of arriving parcels, yet must also group and sort the parcels to meet tight departure deadlines. Thus motivated, this paper presents robust planning models that assign parcels to sort equipment while protecting against different sources of demand uncertainty commonly faced by parcel carriers. We demonstrate the computational viability of the proposed models using realistic-sized instances based on industry data, and show their value in providing sort plan alternatives that trade off operational costs and levels of robustness.
Chapter
The problem of portfolio selection is a decision problem that consists of selecting a set of candidate projects to be supported while optimizing one or more impact measures. As in many real-world decision problems, access to precise and exact information is often impossible. As a result, different methods and techniques for making decisions in an uncertain environment have been proposed in the literature. In this sense, the representation of imprecise data through fuzzy numbers has become common practice in research. In this chapter we study the problem of project portfolio selection when the resources are subject to budget availability. To this end, we model the uncertainty in the total budget for carrying out projects, and the uncertainty in the budgets restricted to different areas, using triangular fuzzy numbers. First, we describe a proposed bi-objective fuzzy model inspired by the project selection model, considering the decisions of selection and resource allocation at the task level. Then, we apply several methods for comparing fuzzy numbers in order to transform the fuzzy programming model into a mathematical model with crisp values. Naturally, different methods result in different project portfolios. For this reason, the results of applying the different methods for comparing fuzzy numbers are discussed using one multi-objective solution method.
Book
Full-text available
Fuzzy Set Theory - And Its Applications, Third Edition is a textbook for courses in fuzzy set theory. It can also be used as an introduction to the subject. The character of a textbook is balanced with the dynamic nature of the research in the field by including many useful references to develop a deeper understanding among interested readers. The book updates the research agenda (which has witnessed profound and startling advances since its inception some 30 years ago) with chapters on possibility theory, fuzzy logic and approximate reasoning, expert systems, fuzzy control, fuzzy data analysis, decision making and fuzzy set models in operations research. All chapters have been updated. Exercises are included.
Article
Full-text available
This paper describes how the scenario aggregation principle can be combined with approximate solutions of the individual scenario problems, resulting in a computationally efficient algorithm where two individual Lagrangian-based procedures are merged into one. Computational results are given for an example from fisheries management. Numerical experiments indicate that only crude scenario solutions are needed.
Article
Full-text available
Practical scheduling problems typically require decisions without full information about the outcomes of those decisions. Yields, resource availability, performance, demand, costs, and revenues may all vary. Incorporating these quantities into stochastic scheduling models often produces difficulties in analysis that may be addressed in a variety of ways. In this paper, we present results based on stochastic programming approaches to the hierarchy of decisions in typical stochastic scheduling situations. Our unifying framework allows us to treat all aspects of a decision in a similar framework. We show how views from different levels enable approximations that can overcome nonconvexities and duality gaps that appear in deterministic formulations. In particular, we show that the stochastic program structure leads to a vanishing Lagrangian duality gap in stochastic integer programs as the number of scenarios increases.
Article
Full-text available
This paper considers the two-stage stochastic integer programming problem, with an emphasis on instances in which integer variables appear in the second stage. Drawing heavily on the theory of disjunctive programming, we characterize convexifications of the second stage problem and develop a decomposition-based algorithm for the solution of such problems. In particular, we verify that problems with fixed recourse are characterized by scenario-dependent second stage convexifications that have a great deal in common. We refer to this characterization as the C3 (Common Cut Coefficients) Theorem. Based on the C3 Theorem, we develop a decomposition algorithm which we refer to as Disjunctive Decomposition (D2). In this new class of algorithms, we work with master and subproblems that result from convexifications of two coupled disjunctive programs. We show that when the second stage consists of 0-1 MILP problems, we can obtain accurate second stage objective function estimates after finitely many steps. This result implies the convergence of the D2 algorithm.
Article
Full-text available
Certain multistage decision problems that arise frequently in operations management planning and control allow a natural formulation as multistage stochastic programs. In job shop scheduling, for example, the first stage could correspond to the acquisition of resources subject to probabilistic information about the jobs to be processed, and the second stage to the actual allocation of the resources to the jobs given deterministic information about their processing requirements. For two simple versions of this two-stage hierarchical scheduling problem, we describe heuristic solution methods and show that their performance is asymptotically optimal both in expectation and in probability.
Article
Full-text available
In this paper we propose a methodology to set prices of perishable items in the context of a retail chain with coordinated prices among its stores and compare its performance with actual practice in a real case study. We formulate a stochastic dynamic programming problem and develop heuristic solutions that approximate optimal solutions satisfactorily. To compare this methodology with current practices in the industry, we conducted two sets of experiments using the expertise of a product manager of a large retail company in Chile. In the first case, we contrast the performance of the proposed methodology with the revenues obtained during the 1995 autumn-winter season. In the second case, we compare it with the performance of the experienced product manager in a "simulation-game" setting. In both cases, our methodology provides significantly better results than those obtained by current practices.
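A minimal finite-horizon dynamic program in the spirit of the pricing problem described (hypothetical prices and a Poisson demand model; the paper's heuristics are not reproduced): the state is the remaining inventory, the decision is the period's price, and leftover stock at the end of the season is worthless.

    import numpy as np
    from scipy.stats import poisson

    T, max_inv = 8, 20
    prices = [4.0, 6.0, 8.0]
    demand_mean = {4.0: 5.0, 6.0: 3.0, 8.0: 1.5}   # hypothetical price-demand curve

    def expected_value(V_next, price, inv):
        d = np.arange(inv + 25)                    # demand support (tail truncated)
        p = poisson.pmf(d, demand_mean[price])
        sales = np.minimum(d, inv)
        return float(np.sum(p * (price * sales + V_next[inv - sales])))

    V = np.zeros(max_inv + 1)                      # terminal values: leftovers worth 0
    policy = np.zeros((T, max_inv + 1))
    for t in reversed(range(T)):
        V_new = np.zeros_like(V)
        for inv in range(max_inv + 1):
            vals = [expected_value(V, pr, inv) for pr in prices]
            V_new[inv] = max(vals)
            policy[t, inv] = prices[int(np.argmax(vals))]
        V = V_new
    print("price to post in period 0 with full stock:", policy[0, max_inv])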
Article
Full-text available
Linear programming is a fundamental planning tool. It is often difficult to precisely estimate or forecast certain critical data elements of the linear program. In such cases, it is necessary to address the impact of uncertainty during the planning process. We discuss a variety of LP-based models that can be used for planning under uncertainty. In all cases, we begin with a deterministic LP model and show how it can be adapted to include the impact of uncertainty. We present models that range from simple recourse policies to more general two-stage and multistage SLP formulations. We also include a discussion of probabilistic constraints. We illustrate the various models using examples taken from the literature. The examples involve models developed for airline yield management, telecommunications, flood control, and production planning.
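To make the simple recourse idea concrete, here is a tiny extensive-form example with illustrative data (not taken from the paper): production x is chosen before demand is known, and any shortfall in scenario s is covered at a premium via recourse y_s; the scenario-expanded problem is a single LP.

    import numpy as np
    from scipy.optimize import linprog

    c_prod, c_short = 1.0, 3.0
    demands = np.array([5.0, 8.0, 12.0])
    probs = np.array([0.3, 0.5, 0.2])

    # Variables z = [x, y_1, y_2, y_3]; minimize c_prod*x + sum_s p_s*c_short*y_s.
    c = np.concatenate([[c_prod], c_short * probs])
    # Recourse feasibility: x + y_s >= d_s, written as -x - y_s <= -d_s.
    A_ub = np.zeros((3, 4))
    A_ub[:, 0] = -1.0
    for s in range(3):
        A_ub[s, 1 + s] = -1.0
    res = linprog(c, A_ub=A_ub, b_ub=-demands, bounds=[(0, None)] * 4, method="highs")
    print("here-and-now production x* =", res.x[0])   # 8.0 for these data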
Article
Full-text available
The probabilistic traveling salesman problem (PTSP) is defined on a graph G=(V,E), where V is the vertex set and E is the edge set. Each vertex v_i has a probability p_i of being present. With each edge (v_i, v_j) is associated a distance or cost c_ij. In a first stage, an a priori Hamiltonian tour on G is designed. The list of present vertices is then revealed. In a second stage, the a priori tour is followed by skipping the absent vertices. The PTSP consists of determining a first-stage solution that minimizes the expected cost of the second-stage tour. The problem is formulated as an integer linear stochastic program, and solved by means of a branch-and-cut approach which relaxes some of the constraints and uses lower bounding functionals on the objective function. Problems involving up to 50 vertices are solved to optimality.
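The expected cost of a fixed a priori tour has a closed form under independent presence probabilities (due to Jaillet): vertex i is followed directly by vertex j in the second-stage tour exactly when both are present and every vertex between them on the tour is absent. A direct O(n^2) evaluation:

    import numpy as np

    def expected_tour_cost(tour, p, c):
        """Expected second-stage cost of an a priori PTSP tour.
        tour: visiting order; p[i]: presence probability; c[i][j]: edge cost."""
        n = len(tour)
        total = 0.0
        for a in range(n):
            skip = 1.0            # prob. that all vertices strictly between are absent
            for step in range(1, n):
                b = (a + step) % n
                i, j = tour[a], tour[b]
                total += p[i] * p[j] * skip * c[i][j]
                skip *= 1.0 - p[j]    # j absent: the edge reaches one vertex further
        return total

    rng = np.random.default_rng(0)
    pts = rng.random((6, 2))
    c = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    print(expected_tour_cost(list(range(6)), [0.7] * 6, c))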
Article
Full-text available
This paper describes the formulation of the Russell-Yasuda Kasai financial planning model, including the motivation for the model. The presentation complements the discussion of the technical details of the financial modeling process and the managerial impact of its use to help allocate the firm’s assets over time discussed in [D. R. Carino et al., Interfaces 24, No. 1, 29-49 (1994); D. R. Carino, D. H. Myers and W. T. Ziemba, Oper. Res. 46, No. 4, 450-462 (1998)]. The multistage stochastic linear program incorporates Yasuda Kasai’s asset and liability mix over a five-year horizon followed by an infinite horizon steady-state end-effects period. The objective is to maximize expected long-run profits less expected penalty costs from constraint violations over the infinite horizon. Scenarios are used to represent the uncertain parameter distributions. The constraints represent the institutional, cash flow, legal, tax, and other limitations on the asset and liability mix over time.
Article
Full-text available
Mathematical programming models with noisy, erroneous, or incomplete data are common in operations research applications. Difficulties with such data are typically dealt with reactively—through sensitivity analysis—or proactively—through stochastic programming formulations. In this paper, we characterize the desirable properties of a solution to models, when the problem data are described by a set of scenarios for their value, instead of using point estimates. A solution to an optimization model is defined as: solution robust if it remains “close” to optimal for all scenarios of the input data, and model robust if it remains “almost” feasible for all data scenarios. We then develop a general model formulation, called robust optimization (RO), that explicitly incorporates the conflicting objectives of solution and model robustness. Robust optimization is compared with the traditional approaches of sensitivity analysis and stochastic linear programming. The classical diet problem illustrates the issues. Robust optimization models are then developed for several real-world applications: power capacity expansion; matrix balancing and image reconstruction; air-force airline scheduling; scenario immunization for financial planning; and minimum weight structural design. We also comment on the suitability of parallel and distributed computer architectures for the solution of robust optimization models.
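In the notation of this framework, with design variables x, scenario-dependent control variables y_s and infeasibility (error) vectors z_s for scenarios s = 1, ..., S, the robust optimization model takes the form

    \min_{x,\, y_s,\, z_s} \;\; \sigma(x, y_1, \ldots, y_S) \;+\; \omega\, \rho(z_1, \ldots, z_S)
    \text{s.t.} \quad A x = b, \qquad B_s x + C_s y_s + z_s = e_s, \qquad x \ge 0, \; y_s \ge 0,

where sigma measures solution robustness (for example, the mean of the scenario costs plus a variability penalty), rho measures model robustness by penalizing the infeasibilities z_s, and the weight omega trades the two off.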
Article
In this study, an inexact fuzzy multiobjective programming (IFMOP) method is developed and applied to a case study of water pollution control planning in the Lake Erhai Basin. The IFMOP method improves upon existing multiobjective programming methods, with advantages in data availability, solution algorithm, computational requirements and result interpretation. The case study project was supported by the United Nations Environment Programme (UNEP). The results indicate that desired schemes for a number of system activities in different subareas/periods were obtained. Inheriting the uncertain nature of the model inputs, the majority of solutions are inexact values, which provide decision-makers with a flexible decision space. Generally, the modeling results provide scientific bases for the formulation of policies/strategies regarding regional socio-economic development and environmental protection.
Article
Stochastic programming - the science that provides us with tools to design and control stochastic systems with the aid of mathematical programming techniques - lies at the intersection of statistics and mathematical programming. The book Stochastic Programming is a comprehensive introduction to the field and its basic mathematical tools. While the mathematics is of a high level, the developed models offer powerful applications, as revealed by the large number of examples presented. The material ranges from basic linear programming to algorithmic solutions of sophisticated systems problems and applications in water resources and power systems, shipbuilding, inventory control, etc. Audience: Students and researchers who need to solve practical and theoretical problems in operations research, mathematics, statistics, engineering, economics, insurance, finance, biology and environmental protection.
Article
The paper considers a sequential Design Of Experiments (DOE) scheme. Our objective is to maximize both information and economic measures over a feasible set of experiments. Optimal DOE strategies are developed by introducing information criteria based on measures adopted from information theory. The evolution of acquired information along various stages of experimentation is analyzed for linear models with a Gaussian noise term. We show that for particular cases, although the amount of information is unbounded, the desired rate of acquiring information decreases with the number of experiments. This observation implies that at a certain point in time it is no longer efficient to continue experimenting. Accordingly, we investigate methods of stochastic dynamic programming under imperfect state information as appropriate means to obtain optimal experimentation policies. We propose cost-to-go functions that model the trade-off between the cost of additional experiments and the benefit of incremental information. We formulate a general stochastic dynamic programming framework for design of experiments and illustrate it by analytic and numerical implementation examples.
Article
Capacity expansion is the process of providing new facilities over time to meet rising demand. A general mathematical model of this process is presented, incorporating uncertain future demand (including the possibility of ‘surprises’), non-zero lead times and random cost overruns. In this model the decision maker controls the rate of investment in the current expansion project. Optimization is studied by methods of stochastic control theory. Numerical algorithms are presented which determine the optimal policy in some simple cases.
Article
In this paper, we focus on two kinds of linear programming problems with fuzzy numbers: interval number and fuzzy number linear programming, respectively. Linear programming problems with interval number coefficients are approached by taking maximum value range and minimum value range inequalities as constraint conditions, reducing the problem to two classical linear programs and obtaining an optimal interval solution. Fuzzy linear programming problems with fuzzy number coefficients are approached in two ways: the ''fuzzy decisive set approach'' and the ''interval number linear programming approach for several membership levels''. Finally, numerical solutions of the illustrative examples are given.
Article
Although decisions frequently have uncertain consequences, optimal-decision models often replace those uncertainties with averages or best estimates. Limited computational capability may have motivated this practice in the past. Recent computational advances have, however, greatly expanded the range of optimal-decision models with explicit consideration of uncertainties. This article describes the basic methodology for these stochastic programming models, recent developments in computation, and several practical applications.
Article
In recent years, it has become very important to build mathematical models for uncertain situations in the fields of systems control and systems engineering. Among these uncertain situations there are randomness and fuzziness. Probability theory is usually used for the former, but a new concept becomes necessary for the latter. Fuzzy sets were introduced by L.A. Zadeh to provide a concept for this purpose. Decision-making in a fuzzy environment, an important problem in fuzzy systems, was researched by R.E. Bellman and L.A. Zadeh on the basis of fuzzy sets. In that research, an optimal decision is an alternative which attains the maximum of the membership function of a fuzzy decision. In this paper, we formulate a fuzzy mathematical programming problem by introducing the idea of level sets on the basis of the optimal decision. The programming also determines the constraint set, because the constraint set is fuzzy; in this sense fuzzy mathematical programming differs from ordinary mathematical programming. The fuzzy mathematical program may be reduced to an ordinary mathematical program under two sufficient conditions: α-continuity and uniqueness of the optimal level. These two conditions are shown in this paper. An algorithm proposed by the authors may be used to solve the problem in which only the condition of α-continuity holds.
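The level-set device referred to here replaces a fuzzy constraint set by its crisp alpha-cuts,

    X_{\alpha} \;=\; \{\, x \;:\; \mu_{\tilde{X}}(x) \ge \alpha \,\}, \qquad \alpha \in (0, 1],

so that, for a fixed level alpha, the fuzzy program becomes an ordinary mathematical program over X_alpha; alpha-continuity and uniqueness of the optimal level are precisely the conditions under which this reduction is exact.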
Article
The paper provides a stochastic linear programming model which takes into account the effects of some of the random variations in linear programming models for agriculture resource planning. A deterministic equivalent of the proposed model is obtained and its use in agricultural economics is illustrated through a numerical example.
Article
This is a summary of my PhD thesis with the same title. This thesis presents a scenario based optimisation model to analyze the investment policy and funding policy for pension funds, taking into account the development of the liabilities in conjunction with the economic environment. Such a policy will be referred to as an asset liability management (ALM) policy. A model has been developed to compute dynamic ALM policies that: (1) guarantee an acceptably small probability of underfunding, (2) guarantee sufficiently stable future contributions, (3) minimise the present value of expected future contributions by the plan sponsors.
Article
In problems of system analysis, it is customary to treat imprecision by the use of probability theory. It is becoming increasingly clear, however, that in the case of many real world problems involving large scale systems such as economic systems, social systems, mass service systems, etc., the major source of imprecision should more properly be labeled ‘fuzziness’ rather than ‘randomness.’ By fuzziness, we mean the type of imprecision which is associated with the lack of sharp transition from membership to nonmembership, as in tall men, small numbers, likely events, etc. In this paper our main concern is with the application of the theory of fuzzy sets to decision problems involving fuzzy goals and strategies, etc., as defined by R. E. Bellman and L. A. Zadeh [1]. However, in our approach, the emphasis is on mathematical programming and the use of the concept of a level set to extend some of the classical results to problems involving fuzzy constraints and objective functions.
Article
This paper considers vehicle routing problems (VRPs) with stochastic service and travel times, in which vehicles incur a penalty proportional to the duration of their route in excess of a preset constant. Three mathematical programming models are presented: a chance constrained model, a three-index simple recourse model and a two-index recourse model. A general branch and cut algorithm for the three models is described. Computational results indicate that moderate size problems can be solved to optimality.
Article
The problem of expanding the capacity of a single facility in telecommunications network planning is addressed. This problem can be formulated as a time-dependent knapsack when the relevant information is assumed to be known. We introduce the use of scenarios to model uncertainty in key data. The problem is formulated within the robust optimization framework and solved exactly in two phases. The first phase consists of a dynamic programming recursion and the second of a shortest path procedure. Experiments show that a large number of scenarios can be handled with this technique, because computational times are more sensitive to the maximum demand across all scenarios than to the number of scenarios considered. A user interface based on Microsoft Excel is developed as a decision support system for network planners.
Article
One of the key components of chemical plant operability is flexibility—the ability to operate over a range of conditions while satisfying performance specifications. A general framework for analyzing flexibility in chemical process design is presented in this paper. A quantitative index is proposed which measures the size of the parameter space over which feasible steady-state operation of the plant can be attained by proper adjustment of the control variables. The mathematical formulation of this index and a detailed study of its properties are presented. Application of the flexibility in design is illustrated with an example.
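In the now-standard formalization of this index (following Swaney and Grossmann), the flexibility index is the largest scaled deviation delta of the uncertain parameters theta from their nominal values theta^N that the design d can tolerate with suitable adjustment of the controls z:

    F \;=\; \max \delta
    \text{s.t.} \quad \max_{\theta \in T(\delta)} \; \min_{z} \; \max_{j \in J} \; f_j(d, z, \theta) \;\le\; 0,
    T(\delta) \;=\; \{\theta : \theta^N - \delta\, \Delta\theta^{-} \le \theta \le \theta^N + \delta\, \Delta\theta^{+}\},

where the f_j are the feasibility constraints of the process.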
Article
The potential conflict between protection of water quality and economic development by different uses of land within river basins is a common problem in regional planning. Many studies have applied multiobjective decision analysis under uncertainty to problems of this kind. This paper presents the interactive fuzzy interval multiobjective mixed integer programming (IFIMOMIP) model to evaluate optimal strategies of wastewater treatment levels within a river system by considering the uncertainties in decision analysis. The interactive fuzzy interval multiobjective mixed integer programming approach is illustrated in a case study for the evaluation of optimal wastewater treatment strategies for water pollution control in a river basin. In particular, it demonstrates how different types of uncertainty in a water pollution control system can be quantified and combined through the use of interval numbers and membership functions. The results indicate that such an approach is useful for handling system complexity and generating more flexible policies for water quality management in river basins.
Article
This paper considers a class of capacitated facility location problems in which customer demands are stochastic. The problem is formulated as a stochastic integer linear program, with first stage binary variables and second stage continuous variables. It is solved to optimality by means of a branch and cut method. Computational results are reported for problems involving up to 40 customers and 10 potential facility locations.
Article
The combinatorial nature of problems in process systems engineering has led to the establishment of mixed-integer optimization techniques as a major paradigm in this area. Despite the success of these methods in tackling practical sized problems, the issue of exponential increase of the computational effort with the problem size has been of great concern. A major open question has been whether there is any hope of ever designing ‘efficient’ exact algorithms for this class of problems. Further, if such algorithms are not possible, one would like to know whether provably good approximation schemes can be devised. In this paper, we pursue analytical investigations to provide answers to these two questions in the context of the process planning problem. By means of a computational complexity analysis, we first prove that the general process planning problem is NP-hard, and thus efficient exact algorithms for this class of problems are unlikely to exist. Subsequently, we develop an approximation scheme for this problem and prove, via probabilistic analysis, that the error of the heuristic vanishes asymptotically with probability one as the problem size increases. Finally, we present computational results to substantiate the theoretical findings.
Article
We study a problem of determining the production schedule of style goods, such as clothing and consumer durables, under capacity constraints. Demand for items is stochastic and occurs in the last season of the planning horizon. Demand estimates are revised in each period. We exploit the problem's two-level hierarchical structure, which is characterized by families and items. Production changeover costs from one family to another are high, compared to other costs. However, changeover costs between items in the same family are negligible. We first formulate this problem as a difficult-to-solve stochastic mixed integer programming problem. Then, exploiting the problem's hierarchical structure, we formulate a deterministic, mixed integer programming problem and solve it by means of an algorithm that provides an approximate solution. A lower bound is obtained by applying generalized linear programming to the approximate problem. We illustrate the procedure using the disguised data of a consumer electronics company.
Article
This paper presents a methodology for the solution of multistage stochastic optimization problems, based on the approximation of the expected-cost-to-go functions of stochastic dynamic programming by piecewise linear functions. No state discretization is necessary, and the combinatorial “explosion” with the number of states (the well known “curse of dimensionality” of dynamic programming) is avoided. The piecewise functions are obtained from the dual solutions of the optimization problem at each stage and correspond to Benders cuts in a stochastic, multistage decomposition framework. A case study of optimal stochastic scheduling for a 39-reservoir system is presented and discussed.
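The piecewise linear approximation described here represents the expected cost-to-go at each stage by the pointwise maximum of cuts built from dual solutions: writing x_t for the stage-t state vector,

    \mathcal{Q}_{t+1}(x_t) \;\approx\; \max_{k = 1, \ldots, K} \;\big\{ \alpha_k + \beta_k^{\top} x_t \big\},

where each pair (alpha_k, beta_k) is obtained from the dual multipliers of a stage-(t+1) subproblem, exactly as in Benders decomposition. This construction is the basis of what is now commonly called stochastic dual dynamic programming (SDDP).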
Article
To each stochastic program corresponds an equivalent deterministic program. The purpose of this paper is to compile and extend the known properties of the equivalent deterministic program of a stochastic program with fixed recourse. After a brief discussion of the place of stochastic programming in the realm of stochastic optimization, the definition of the problem at hand, and the derivation of the deterministic equivalent problem, the question of feasibility is treated in Section 4, with a description of algorithmic procedures for finding feasible points in Section 5 and a characterization of a special but important class of problems in Section 6. Section 7 deals with the properties of the objective function of the deterministic equivalent problem, in particular with continuity, differentiability and convexity. Finally, in Section 8, we view the equivalent deterministic program in terms of its stability, dualizability and solvability properties.
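For the two-stage case with fixed recourse matrix W, the deterministic equivalent discussed here reads

    \min_{x} \;\; c^{\top} x + \mathbb{E}_{\xi}\big[ Q(x, \xi) \big] \quad \text{s.t.} \quad A x = b, \; x \ge 0,
    Q(x, \xi) \;=\; \min_{y \ge 0} \;\big\{\, q(\xi)^{\top} y \;:\; W y = h(\xi) - T(\xi)\, x \,\big\},

and it is the continuity, differentiability and convexity of the recourse term in x that the paper characterizes.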
Article
Production capacity has always been one of the most important strategic variables for the major automobile companies. Decisions by individual companies concerning the overall level of capacity, the type of facility (e.g., the level of flexibility), and the location of that capacity (e.g., in the United States or abroad) are discussed in great detail in the popular business press. In this paper, we describe a model developed for General Motors to aid in making decisions about capacity for four of their auto lines. The model incorporates elements of scenario planning, integer programming, and risk analysis. All the input and output is done using Lotus 1-2-3. Although the presentation is motivated by the particular application in the auto industry, the model represents a general purpose approach that is applicable to a wide variety of decisions under risk. An example in this paper uses actual data, appropriately transformed to ensure confidentiality.
Article
We develop a general framework for the study and control of the eutrophication process in shallow lakes. The randomness of the environment, i.e., the variability in hydrological and meteorological conditions, is an intrinsic characteristic of such systems that cannot be ignored in the analysis of the process or by management in the design of control measures.
Article
This paper is concerned with a new linearization strategy for a class of zero-one mixed integer programming problems that contains quadratic cross-product terms between continuous and binary variables, and between the binary variables themselves. This linearization scheme provides an equivalent mixed integer linear programming problem which yields a tighter continuous relaxation than that obtainable via the alternative linearization techniques available in the literature. Moreover, the proposed technique provides a unifying framework in the sense that all the alternate methods lead to formulations that are accessible through appropriate surrogates of the constraints of the new linearized formulation. Extensions to various other types of mixed integer nonlinear programming problems are also discussed.
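As an illustration of the class of substitutions involved (the standard device, not the paper's tighter scheme), a cross product w = x y between a binary x and a continuous y in [0, U] is replaced exactly by the linear constraints

    w \le U x, \qquad w \le y, \qquad w \ge y - U(1 - x), \qquad w \ge 0,

and a product of two binaries x_1 x_2 by w <= x_1, w <= x_2, w >= x_1 + x_2 - 1, w >= 0. The paper's contribution is a linearization whose continuous relaxation is tighter than such standard alternatives.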
Article
Consideration is given to a multistage stochastic program with recourse, with discrete distribution, quadratic objective function and linear inequality constraints. It is shown that under reasonable assumptions, solving such a program is equivalent to solving a nested sequence of piecewise quadratic programs and the author extends the algorithm presented in an earlier report to the multistage situation. The application of the method to an energy investment problem and report on the results of numerical experiments are presented.
Article
This paper gives an algorithm for L-shaped linear programs, which arise naturally in optimal control problems with state constraints and in stochastic linear programs (which can be represented in this form with an infinite number of linear constraints). The first section describes a cutting hyperplane algorithm which is shown to be equivalent to a partial decomposition algorithm of the dual program. The last two sections are devoted to applications of the cutting hyperplane algorithm to a linear optimal control problem and to stochastic programming problems.
Article
This paper addresses the problem of establishing the optimal trade-off between investment costs for a retrofit and the expected revenue that results from increasing flexibility in systems described by nonlinear models. A systematic procedure is first proposed for constructing the cost vs. flexibility curve. A stochastic optimization method is then presented for evaluating the expected optimal revenue at a number of redesigns with specified degrees of flexibility, from which the trade-off curve relating expected revenue to flexibility is generated. This allows one to identify the level of flexibility that maximizes the expected profit in a retrofit design. Examples are presented to illustrate the proposed strategy.
Article
In this paper two realistic aspects are imposed on a transportation system: a budgetary constraint and random demands at the various destinations. Goal programming has been employed to solve the problem under a budgetary constraint with deterministic demands, and fuzzy goal programming (FGP) has been used to solve the problem with random demands. A numerical example is worked out to highlight the solution procedures.
Article
For a practical bank hedging decision optimization problem, interest rates and the price of a futures contract may involve both fuzziness and randomness. Owing to the subjective nature of satisfaction, the maximum desired values of loan demand, deposit supply and the desired loan-to-deposit ratio are often fuzzy. In this study, we consider and solve a stochastic possibilistic programming model of bank hedging decision problems with the above characteristics. We first use the expected value to obtain an auxiliary possibilistic linear programming problem, which is further resolved by use of β-level cuts. A (crisp) auxiliary bi-objective linear programming model is then proposed and solved by our augmented maximum approach. For illustration purposes, a numerical bank hedging decision problem is solved.
Article
The minimization of a convex function of variables subject to linear inequalities is discussed briefly in general terms. Dantzig's Simplex Method is extended to yield finite algorithms for minimizing either a convex quadratic function or the sum of the t largest of a set of linear functions, and the solution of a generalization of the latter problem is indicated. In the last two sections a form of linear programming with random variables as coefficients is described, and shown to involve the minimization of a convex function.