Article

Risk Modeling, Assessment, and Management

Authors:
Yacov Y. Haimes

Abstract

Introduction to Modeling and Optimization Classical Unconstrained Optimization Problems Classical Equality Constraint Problem Newton-Raphson Method Linear Programming Dynamic Programming Generalized Nonlinear Programming Multiobjective Decision Trees Derivation of the Expected Value of a Log-normal Distribution Derivation of the Conditional Expected Value of a Log-normal Distribution Triangular Distribution: Unconditional and Conditional Expected Values Standard Normal Distribution Probability Table
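For quick reference, the log-normal items listed above correspond to standard results (a sketch only, not the book's derivations), assuming X is log-normal with parameters mu and sigma^2 and a partitioning threshold x_0:

E[X] = \exp\left(\mu + \tfrac{\sigma^2}{2}\right),
\qquad
E[X \mid X > x_0] = \exp\left(\mu + \tfrac{\sigma^2}{2}\right)\,
\frac{\Phi\left(\frac{\mu + \sigma^2 - \ln x_0}{\sigma}\right)}{\Phi\left(\frac{\mu - \ln x_0}{\sigma}\right)}

where \Phi denotes the standard normal cumulative distribution function. The conditional expectation is the quantity used when partitioning a loss distribution into extreme and non-extreme ranges.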


... First, managerial threat perceptions are likely to affect firm preparation. Strategic decisions around disaster preparation are likely to be affected by managers' subjective judgments and/or knowledge about disaster risks (Haimes, 2009; Miceli, Sotgiu, & Settani, 2008; Slovic, 1987; Wachinger, Renn, Begg, & Kuhlicke, 2013: 1051). Depending upon the nature of their experience, managers may either over- or underestimate disaster risk and thus over- or underprepare. ...
... Translating experience into action requires an understanding of the mechanisms linking experience and preparation. Two of these mechanisms are managers' subjective judgments and/or knowledge about disaster risks (Haimes, 2009; Miceli et al., 2008; Slovic, 1987; Wachinger et al., 2013: 1051) and the political power to make or influence strategic decision-making. Ironically, managers may be less likely to utilize experience if they are risk averse. ...
... Managers and their organizations must consider natural disasters to be serious risks for them to be considered salient. In fact, there is strong evidence that managers and their organizations barely consider the statistical likelihood or perceived magnitude of certain risks when assessing perceived threats (Haimes, 2009; Miceli et al., 2008; Wachinger et al., 2013: 1051). One reason is that research on firm preparation for future threats, particularly those that are uncertain, infrequent, or rare, takes a back seat at times to addressing immediate threats that are more certain (Tashman & Rivera, 2016). ...
Article
Research emphasizes the value of disaster preparation and the importance of experience in doing so, yet most companies fail to prepare. The antecedents of preparation are poorly understood, in part, because experience by itself only partly explains the story. To address these concerns, we developed two unique surveys: one from an international survey in 18 disaster‐prone countries and another from a U.S. survey in New York City and Miami. We find that organizational experience with natural disasters increases preparedness for future hazards. Also, organizational learning from others positively mediates this relationship. Managers are more willing to learn from others in locations characterized by high impact, low frequency disasters. In areas with low impact, high frequency disasters, managers more likely misjudge the severity of natural disasters.
... Project risk management (PRM) is concerned with reducing delays and cost overruns, while satisfying specification (spec) and quality requirements (Chapman and Ward, 2003). Hoffman and Haimes's studies [3,4] indicated that system objectives in operational risk management (ORM), such as reliability, safety, security, availability, and business continuity, are subject to risk in operational settings. Analytical risk-integrated system modelling attempts to define the system's (multi-)objective function, while capturing risk, using mathematical building blocks such as input, output, state variables, decision (control) variables, and random variables. ...
... 3. Distribution module, which defines the transparencies and the functions required to realize these transparencies (Figure 7). 4. Conformance module, which addresses implementation, consistency, and conformance checking requirements (Figure 8). ...
... 3. Risk Analysis: Assign values to the consequence and the likelihood of occurrence of each threat identified in sub-process 2 (Figure 14). 4. Risk Evaluation: Identify the level of risk associated with the threats already identified and assessed in the previous sub-processes (Figure 15). The relationships among the sub-processes are given in Figure 11, and the sub-processes themselves are shown in Figures 12 to 15. ...
... Although every construction project is different and each should be considered afresh, the process of risk management comprises a fixed set of techniques that can be applied to any project (Flanagan & Norman, 1993). Hence, good management of a project must incorporate effective risk management in the form of a standardised process (Haimes, 2009). A review of the literature identified the following steps for an effective risk management process: ...
... To make it more clear, this step aims at answering the following question: What can go wrong? (Haimes, 2009). ...
... The response management process is a decision-making process (Haimes, 2009) that has two objectives: (a) eliminating the adverse impacts as much as possible, and (b) increasing control over risk (Al-Bahar & Crandall, 1990). There are four common types of responses to handle risks in the literature that are defined as follows: ...
Thesis
Full-text available
Tolerance-related problems are amongst the most common, recurring defects in construction projects. They are often dealt with on an ad hoc basis and at the time and place of the assembly process. The existing academic literature and the industrial guidelines provide only general recommendations for the improvement of tolerance management, and a pragmatic and holistic process for this purpose is still missing. This research aims at developing a process to proactively identify and prevent tolerance problems at the stages preceding assembly on site. Design Science Research is the adopted methodological approach as the focus is on prescribing a solution to solve a practical problem, as well as on contributing to theory. To design a workable solution, the literature not only in construction but also in manufacturing is reviewed, empirical data is collected from three cases, fifteen tolerance problems are documented and analysed, and a detailed root cause analysis is performed for the identified tolerance problems. The solution devised is a process, called Tolerance Management System (TMS), which has five parts, each comprising a set of steps, documents, methods and techniques implemented through a particular organisational design. The parts of TMS are: identification of tolerance requirements/risks, planning the achievement of tolerance requirements/mitigation of tolerance risks, communication of tolerance information, tolerance compliance measurement, and learning and documentation. Process standardisation and continuous improvement are two foundational elements of lean that are employed in TMS. Two focus group meetings are conducted to evaluate whether the developed solution fulfils its aim and to refine it further. It was pinpointed during the focus group meetings that many of the TMS steps could be adopted in practice immediately to help practitioners deal with tolerances more systematically. The research results in contributions to the theory by providing a better understanding of not only a typical but also an advanced practice of tolerance management in the construction industry, and by providing a comprehensive list of root causes of the identified tolerance problems. A contribution to both theory and practice is the developed solution, TMS, by which (a) tolerances can be taken systematically into account from project inception to completion, (b) tolerance information can be effectively communicated amongst designers and construction trades, and (c) the conventional focus on the compliance of deviations of a single component with standards is shifted to whether sub-assemblies function properly within the specified tolerances.
... However, even if the prospective probabilities were known, whether or not, and to what extent, nonlinear appraisal weightings ought to be applied to either the estimates of negative impact (see Haimes 1998) or their probability of occurrence (see, for example, Tversky and Wakker 1995) is not something that can be addressed by appeal to facts alone. How people perceive and appraise risk individually may be partially informative. ...
... The question of what constitutes a prudent decision in the face of uncertainty over difficult-to-quantify qualitative public policy issues of industrial development, greenhouse, democratic participation and workplace social structures echoes similar questions over technical risks, where the potential outcomes (if not the likelihoods) are more quantifiable, such as operational radiation leaks (mSv of biological dosage received) or environmental contamination during waste disposal (kg of contaminant). Obvious methods of evaluating risk, such as probabilistic expected value and probability of exceedance measures (Haimes 1998), are not, by any means, evidently appropriate. This is particularly so due to the extremely high social costs of some of the possible consequences of nuclear contamination or global climate change, even if they are highly unlikely. ...
Article
Full-text available
This essay was prompted by a recent request for the Society for Sustainability and Environmental Engineering to support a proposed establishment of a chair of Nuclear Physics at an Australian University.
... The process of communication and consultation aims to ensure that communication and consultation with internal and external stakeholders occur during all stages of the risk management process. The establishment of a context provides the scope and risk criteria and sets the external and internal parameters that must be considered in risk management [20]. ...
... Treatment of risks involves the selection of the most suitable alternatives to modify the risks, together with the plans necessary to implement them. In turn, the process of monitoring and critical analysis aims to ensure that controls related to risks are effective and efficient, as well as to obtain additional information to improve the process of risk assessment [20]. ...
Article
Full-text available
The Information and Communication Technology Master Plan (ICTMP) is an important tool for the achievement of the strategic business objectives of public and private organizations. In the public sector, these objectives are closely related to the provision of benefits to society. Information and Communication Technology (ICT) actions are present in all organizational processes and involve sizeable budgets. The risks inherent in the planning of ICT actions need to be considered for ICT to add value to the business and to maximize the return on investment to the population. In this context, this work intends to examine the use of risk management processes in the development of ICTMPs in the Brazilian public sector.
... It has become an effective and comprehensive procedure in the overall management to control processes and reduce faulty products as well as related costs [1]. A recent development in the standardization of quality management systems has led to new challenges for companies: the revision of ISO 9001 in 2015, with a new emphasis on risk management, and the end of the transition period in 2018. To gain or maintain the ISO certification, companies are forced to reevaluate their production and quality control processes regarding the risk-based thinking now required. ...
... To understand risk, risk has to be divided into two components: a real one (the potential damage) and a mathematically constructed component, the probability (the likelihood of the occurrence of potential damages) [1]. The state of the art in companies in risk assessment is the Failure Mode and Effects Analysis (FMEA), developed by NASA in 1963 [7]. The basic idea of the FMEA is an analysis performed by a team on a system as well as the faults and failures that can occur in it. ...
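A minimal sketch (Python) of the two-component view of risk described above, combined with the conventional FMEA-style risk priority number. The failure modes, probabilities, damage figures, and 1-10 ratings are illustrative assumptions, not values from the source.

# Score failure modes two ways:
#  - expected loss = probability x potential damage (the two components named above)
#  - classical FMEA risk priority number RPN = severity x occurrence x detection
failure_modes = [
    # (name, probability of occurrence, damage in EUR, severity 1-10, occurrence 1-10, detection 1-10)
    ("short shot", 0.02, 1500, 6, 3, 4),
    ("flash",      0.05,  300, 3, 5, 2),
    ("sink marks", 0.01,  800, 5, 2, 6),
]

for name, p, damage, s, o, d in failure_modes:
    expected_loss = p * damage   # likelihood-weighted "real" component
    rpn = s * o * d              # conventional FMEA priority number
    print(f"{name:12s} expected loss = {expected_loss:7.2f} EUR, RPN = {rpn}")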
Chapter
The revision of the ISO 9001 in 2015 and the end of the transition period in 2018 force companies to integrate risk management into their company structure. Risk assessment as part of risk management poses challenges for many companies, especially SMEs. Existing methods are often complex, subjective and difficult to automate. To address this issue, this paper describes a risk assessment approach that can be fully automated after expert process evaluation. The automated risk assessment is based on a Machine Learning algorithm, which builds a model that predicts the output and allows the use of SPC control charts without measuring components' characteristics. Based on the results of the control charts, the risk can be assessed by calculating the distances to critical values and analyzing the control chart (e.g. run or trend identification). The use of process parameters, which are recorded by sensors, makes it possible to intervene in the process in high risk situations and reduce not only measurements but also the production of scrap. The method was applied to the use case of an injection molding process of a thin-walled thermoplastic. Based on a Design of Experiments, the model was built using a Generalized Linear Regression machine learning algorithm. A predictive validation and an event validation test were used to validate the method. A two-sided t-test at a significance level of α = 5% indicated equality between the predicted and actual mean values. The Event Validation Test provided a 90–100% correct classification.
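A minimal sketch (Python) of the distance-to-limit idea described in the abstract above: predicted outputs are compared against SPC control limits, risk grows as values approach a limit, and a simple run rule flags trends. The numbers, control limits, linear risk scale, and run-rule threshold are illustrative assumptions, not the paper's method.

import numpy as np

predicted = np.array([10.1, 10.3, 10.2, 10.6, 10.8, 10.9, 11.0, 11.1])
center, lcl, ucl = 10.0, 9.0, 11.5          # assumed control limits

# Risk score in [0, 1]: 0 at the center line, 1 at a control limit.
half_width = (ucl - lcl) / 2.0
risk = np.clip(np.abs(predicted - center) / half_width, 0.0, 1.0)

# Western-Electric-style run rule: 7 consecutive points on one side of the center line.
signs = np.sign(predicted - center)
run_alarm = any(abs(signs[i:i + 7].sum()) == 7 for i in range(len(signs) - 6))

print("risk per point:", np.round(risk, 2))
print("run/trend alarm:", run_alarm)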
... Scenario planning may be prompted by a diverse set of questions, and the question most often raised in SA is "What can go wrong?" The scenario organizing mechanism of Hierarchical Holographic Modeling (HHM) illustrates one available framing tool for answering this question. HHM is regarded as "a holistic philosophy/methodology aimed at capturing and representing the essence of the inherent diverse characteristics and attributes of a system" [22,23]. The term holographic refers to the multiview image of the system, which includes views of economic risk, social risk, political risk, etc. ...
Article
Full-text available
Conventional country risk and political risk indexes, to formalize the process, have attempted to standardize and generalize assessment models for factors that are highly context specific. Hence, the value derived from traditional political risk indexes lack precision and are therefore less reliable. This paper re-examines political risk analysis and explains how understanding the topology and nature of political risk in emerging and developing markets is a crucial advancement in developing political risk analysis for the private sector and government agencies. Particular focus is given to develop political risk characterization as a risk analysis category. To bridge the conceptual gap between risk assessment and risk management, this paper proposes the concept of complex adaptive systems as the backdrop for emerging political risk scenarios.
... The model developed is applicable to other comparable systems with more than two security zone areas. Haimes [56] proposed the use of multiple techniques for estimating terrorist actions as probabilities, and this study is in very good agreement with that approach. A PRA has been found to be a useful method for assessing terrorism risks, particularly for forming a baseline comparison of these risks. ...
Article
Full-text available
This paper highlights a risk-based decision-making framework on the basis of probabilistic risk assessment (PRA). Its aim is to enable stakeholders of transport infrastructures to systematically and effectively allocate their limited resources and consequently improve resilience when facing the potential risk of a terrorist attack. The potential risk of a terrorist attack affects the inter-operation of transportation infrastructures, including airports and rail stations, and the regional economy, and imposes additional costs of security or any countermeasures. This novel framework is thus established in order to model the security system, to consider a multitude of threat scenarios, and to assess the decisions and choices taken by the aggressors during various stages of their attack. The framework has the capability to identify the state of partial neutralization, which reveals the losses incurred when the terrorist could not reach the primary target. In this study, an underground railway station interconnected to an international airport has been used as a case study to demonstrate the effectiveness of this novel framework. By the rigorous assessment of potential losses during a variety of threat scenarios, four countermeasures that could minimise losses are proposed: screening of passengers by observation techniques (SPOT), a surveillance system, an increase of the cargo screening rate, and blast-resistant cargo containers. The cost and efficiency assessment is employed to determine the most suitable countermeasures when the value of the security measures equals their cost. Note that ongoing research is still needed to establish better countermeasures since there is no end to the creativity of terrorists. New technology, such as wireless sensors, will play an important role in the security system in the future. In particular, this study will help insurance and rail industries to model and manage risk profiles for critical infrastructure.
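A minimal sketch (Python) of the scenario-based PRA logic described above: expected loss is summed over threat scenarios, including a partial-neutralization outcome, and a countermeasure is attractive when its expected loss reduction is at least its cost. All figures and scenario names are illustrative assumptions, not data from the study.

scenarios = {
    # name: (annual probability, loss if attack succeeds, loss if partially neutralized,
    #        probability the attacker is stopped short of the primary target)
    "station_bomb": (1e-3, 500e6, 120e6, 0.40),
    "cargo_bomb":   (5e-4, 300e6,  60e6, 0.25),
}

def expected_loss(p_neutral_boost=0.0):
    total = 0.0
    for p, full_loss, partial_loss, p_neutral in scenarios.values():
        p_n = min(1.0, p_neutral + p_neutral_boost)
        total += p * (p_n * partial_loss + (1.0 - p_n) * full_loss)
    return total

baseline = expected_loss()
with_measure = expected_loss(p_neutral_boost=0.20)   # e.g. assumed effect of SPOT screening
measure_cost = 2e6

print(f"baseline expected annual loss: {baseline/1e6:.2f} M")
print(f"with countermeasure:           {with_measure/1e6:.2f} M")
print(f"benefit vs cost: {(baseline - with_measure)/1e6:.2f} M vs {measure_cost/1e6:.2f} M")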
... Seeking alternative systems perspectives can also be a more mechanistic process, in which aspects of a system are considered in a hierarchical sense, such as using hierarchical holographic modelling (Haimes 2009). In each case, assumptions must be identified and challenged to ensure we have the best possible understanding of the system before embarking on a system development. ...
Conference Paper
Based on 45 years of experience conducting research and development into spacecraft instrumentation and 13 years' experience teaching Systems Engineering in a range of industries, the Mullard Space Science Laboratory at University College London (UCL) has identified a set of guiding principles that have been invaluable in delivering successful projects in the most demanding of environments. The five principles are: 'principles govern process', 'seek alternative systems perspectives', 'understand the enterprise context', 'integrate systems engineering and project management', and 'invest in the early stages of projects'. A common thread behind the principles is a desire to foster the ability to anticipate and respond to a changing environment with a constant focus on achieving long-term value for the enterprise. These principles are applied in space projects and have been spun-out to non-space projects (primarily through UCL's Centre for Systems Engineering). They are also embedded in UCL's extensive teaching and professional training programme.
... Sensitivity analysis, which studies the variation of a value function with respect to changes in certain parameters, plays a vital role in risk management in derivative markets, especially for portfolio pricing and hedging (cf. Haimes [1]). This is often achieved by estimating the Greeks: price sensitivities related to variations of model parameters are calculated and investigated. ...
Article
Full-text available
Sensitivity analysis is at the core of risk management for financial engineering; to calculate the sensitivity with respect to parameters in models involving probability expectations, the most traditional approach applies the finite difference method, after which an integration by parts formula was developed in the Brownian setting and applied to sensitivity analysis for better computational efficiency than finite differences. Establishing a similar version of the integration by parts formula for the Markovian environment is the main focus and contribution of this paper. It is also shown by numerical simulation that our proposed methodology and approach outperform the traditional finite difference method for sensitivity computation. For empirical studies of sensitivity analysis on an NPV (net present value) model, we show the approaches of modeling, especially parameter estimation of Markov chains given data on company loan states. Applying our newly established integration by parts formula, numerical simulation estimates the variations caused by the capital return rate and the multiplier of overdue loans. Furthermore, managerial implications of these results are discussed for the effectiveness of modeling and investment risk control.
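A minimal sketch (Python) of the traditional finite-difference sensitivity that the paper above takes as its baseline: the sensitivity (delta) of a discounted expected payoff to the initial value is estimated by central differences with common random numbers. The model and parameters are illustrative assumptions; this is not the paper's integration-by-parts estimator.

import numpy as np

rng = np.random.default_rng(0)

def price(S0, r=0.03, sigma=0.2, T=1.0, K=100.0, z=None):
    """Monte Carlo value of a European call under geometric Brownian motion."""
    if z is None:
        z = rng.standard_normal(200_000)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

z = rng.standard_normal(200_000)   # common random numbers reduce estimator variance
h = 0.5
delta_fd = (price(100.0 + h, z=z) - price(100.0 - h, z=z)) / (2.0 * h)
print(f"finite-difference delta ≈ {delta_fd:.4f}")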
... More specific systems are presented and discussed in the scientific literature, in textbooks and papers on risk management (e.g. [10,18,28,35]). Over the years, we have seen a gradual development of these, from pure probability-based risk assessment approaches to broader risk-management frameworks, highlighting both risk assessments and robust/resilient strategies. ...
Article
Full-text available
There is an increasing awareness and recognition of the importance of reflecting knowledge and lack of knowledge in relation to the understanding, assessment and management of risk. Substantial research work has been initiated to better link risk and knowledge. The present paper aims to contribute to this work by distinguishing between different types of knowledge: general knowledge and specific knowledge. For example, in relation to an offshore installation, the former captures knowledge about what could happen and why on offshore installations in general, whereas the latter covers more detailed knowledge related to the specific installation of interest and its operation. Risk management is viewed as the process of making sure that the general knowledge is sufficiently and efficiently used, including the identification of the specific knowledge needed, and ensuring that we have sufficient specific knowledge and control when assessing risk and making decisions. In the paper, we present a risk management framework built on these ideas and knowledge distinction. This framework clarifies interactions between the two knowledge bases, and how these bases can be used to improve the foundation and practice of risk assessment and management.
... Waller (2002) applied the system concept more rigorously both at sociological levels to relate preservation objectives to the ultimate goals of the continuance and betterment of society (fig. 2) and at the detail level to relate causal factors to risks. As discussed by Haimes (1998), this form of hierarchical system modelling enables and facilitates comprehensive risk identification. At the institutional level, preventive care begins with appropriate collection policies and suitable management of the facilities (Merritt 2008, 2010; Simmons 2005; also see Joplin, chapter 15, this volume). ...
Chapter
Full-text available
Collections are created and maintained to contribute to the continuance and betterment of society. They are the material evidence of links between our past, present and future. Collections created yesterday remain available to benefit society by being held and maintained, free from preventable loss. At the same time, they must be readily available for use. An effective and efficient collection storage system must meet both those challenges while retaining flexibility for future development of the collection. Addressing these challenges in a systematic and cost-effective manner requires the development of collection management and conservation strategies based on the best and most comprehensive information available. That is why this book draws on expertise from architects, engineers, scientists, facility managers, security specialists, risk analysts, and diverse others, as well as collection managers and conservators. In some cases, information is available in the form of feedback on how well a goal is being achieved. For example, the time required to deliver a specific item from a collection to a user following a request is an easily and accurately measured number. It can be used as a performance indicator to signal to management when a problem occurs in the collection use function. Unfortunately, many aspects of collection management, especially preservation, do not provide timely feedback. In those cases, skillful allocation of resources must be made today, and in the near future, to ensure a good result in the medium to long term future with little benefit from feedback. Improving the chances of future success relies on commitment to comprehensive issue identification and quantification together with thorough documentation of all available evidence. A key to success is the establishment of a collections conservation program for the institution. The goal of such a conservation program is the long-term preservation of the utility of the collections; thus any conservation approach must facilitate access to the collections and preserve their integrity for research and other uses. Both treatment and prevention are important aspects of conservation, but prevention is by far the most appropriate approach for preservation from a Western philosophical perspective. Preventive conservation programs are built on a foundation of collection risk assessment but must include a staged plan to mitigate risks while maintaining or enhancing accessibility. Implementation plans should be broken into manageable portions to meet short-, medium-, and long-range goals. Resources are available to assist museum personnel in acquiring appropriate conservation information and expertise to assess collection needs and to implement collections care plans/strategies.
... Instead, the service life of bridges should be extended as far as possible for sustainability reasons (Kühn et al. 2008; Jensen et al. 2008). Evaluating the possible interventions and the corresponding consequences can facilitate a rational decision (Haimes 2005), which leads further into risk analysis. An influence diagram or a decision tree model is usually applied for decision support (Hao 2000; Nielsen and Sørensen 2011; Goyet et al. 2013; Leander et al. 2018). ...
... In this case, it is required to compare several random variables synthesized through their percentiles and statistical moments. Several approaches have been proposed to this end, such as a simple comparison of the expected value, the expected utility (Von Neumann & Morgenstern, 1947), the use of low order moments (Markowitz, 1952), risk measures (Jorion, 2007; Mansini, Ogryczak, & Speranza, 2007; Rockafellar & Uryasev, 2000), the Partitioned Multiobjective Risk Method (Asbeck & Haimes, 1984; Haimes, 2009), and the stochastic dominance theory (Levy, 2006), among others. Therefore, the final assessment is derived using a combined approach based on a nonparametric aggregation rule (using the concept of average rank) for attributes 1 and 2; a simple procedure for score assignment for attribute 3; and a lexicographic rule. ...
Article
Full-text available
This article shows a reusable, extensible, adaptable, and comprehensive advanced analytical modeling process to help the U.S. Department of Defense (DoD) with risk-based capital budgeting and optimizing acquisitions and programs portfolios with multiple stakeholders while subject to budgetary, risk, schedule, and strategic constraints. The article covers traditional capital budgeting methodologies in industry and explains how these traditional methods can be applied in the DoD by using DoD-centric non-economic, logistic, readiness, capabilities, and requirements variables. Portfolio optimization for the purposes of selecting the best combination of programs and capabilities is also addressed, as are alternative methods such as average ranking, risk metrics, lexicographic methods, PROMETHEE, ELECTRE, and others. Finally, an illustration from the Program Executive Office Integrated Warfare Systems (PEO-IWS) and Naval Sea Systems Command (NAVSEA) showcases the methodology’s application in developing a comprehensive and analytically robust case study that senior leadership at the DoD may utilize to make optimal decisions.
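A minimal sketch (Python) of the partitioning idea behind the Partitioned Multiobjective Risk Method cited above: alongside the ordinary expected value, a conditional expected value of the worst outcomes is reported so that low-probability, high-consequence behaviour is not averaged away. The synthetic loss samples and the 95% partition point are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=2.0, sigma=1.0, size=100_000)   # synthetic loss samples

threshold = np.quantile(losses, 0.95)
expected_value = losses.mean()                      # conventional risk metric
conditional_ev = losses[losses > threshold].mean()  # expected value of the extreme tail

print(f"E[X]              = {expected_value:.2f}")
print(f"E[X | X > x_0.95] = {conditional_ev:.2f}")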
... His tasks are to identify the risks specific to each activity; to analyze each risk, its nature, its causes, and its relationship to other risks; to measure the degree of risk, estimating the probability of the accident and the size of the loss; and to choose the most appropriate means of managing each of the existing risks according to the required safety and cost. For further information, see references (Asfahl and Rieske 2018), (Haimes and Sage 2015), and (Aven and Zio 2018). ...
Conference Paper
Full-text available
Support for local industry and manufacturing has received a high degree of interest in Saudi Vision 2030. Manufacturing is linked to supply chain management as a total system in which production is one of its parts. Recently, industrial engineering tools have received a high degree of interest; as a branch of engineering science, industrial engineering is concerned with the problems of production systems, aiming to achieve the best environment in which human beings interact with machines in a highly efficient production unit. The tools of industrial engineering that can be applied vary, including statistical, economic, work, reengineering, operations, simulation, feasibility, planning, risk, forecasting, quality, value engineering, and management tools. It has been possible to deduce how industrial engineering tools contribute to finding optimal solutions and positively influence many of the problems facing responsible people in industry and supply chains. Positive impacts include: studying the project idea; choice of project location; opportunities for gain or loss; insight into the future status of the institution; evaluation of suppliers; retaining staff and employees and creating an appropriate environment to increase their productivity; calculating the optimal ordering size and time of inventory demand; the radical change in design and development processes in order to improve productivity; achieving quality of design, manufacturing, and after-sales services; improvements to enhance work status; allocation of machinery and labor forces; forecasting demand in the short and long terms; optimal distribution of transport means and road selection; and optimal choice of advertising means.
... A useful construct is to divide risk analysis into two components: (i) risk assessment (identifying, evaluating and measuring the probability and severity of risks) and (ii) risk management (deciding what to do about risks) (Haimes, 2015). Risk analysis can be qualitative or quantitative in nature: the former uses words or colours to identify and evaluate risks or presents a written description of the risk, while the latter calculates numerical probabilities over the possible consequences (Rausand, 2011). ...
Chapter
Full-text available
Structured abstract Purpose: This chapter aims to present the key issues and main aspects of risk management, as they relate to tourism entrepreneurship, with a focus on the risk management plan and the various strategies used in controlling risks. Design/methodology/approach: A literature review was conducted, and managerial issues and aspects regarding RM in tourism entrepreneurship were highlighted. These issues were illustrated by one example and two case studies from the business world. Findings: This chapter suggests that every probable risk must have a pre-formulated plan to deal with its possible consequences. In the field of tourism entrepreneurship, the elimination of risk by putting safety measures in place is not simply achieved by taking precautions in a haphazard manner. Rather, these tasks require a proactive approach and an intricate and logical plan. Research limitations: This chapter is explorative in nature, based on a literature review and case study analysis. It takes a more entrepreneurial/practical than academic approach. Managerial/practical implications: The chapter provides a risk management process as a generic framework for entrepreneurs/managers in the identification, analysis, assessment, treatment and monitoring of risk related to their business ventures. It also suggests the appropriate steps to follow to manage risks efficiently. Every tourism enterprise should have a strategy and an emergency/contingency plan to address risks. Originality/value: This chapter outlines, in a comprehensive and practical way, a strategic approach to risk management for tourism enterprises. It also highlights the importance and utility of planning and implementing a suitable strategy to effectively address business-related risks.
... From the risk assessment standpoint, the key notions are uncertainty, likelihood (probabilities), and impact (consequences). Depending on the nature of the uncertainties and the information available, various qualitative, quantitative, or semi-quantitative methods can be used for risk description and assessment (Karmen et al., 2019; Haimes & Sage, 2015; Simmons et al., 2017). ...
Article
A steady increase in renewable energy production and supply allows environmentally harmful traditional energy systems to be gradually substituted. Developers of renewable projects encounter various types of risks inherent to these projects, and all these risks should be studied in advance and ways of mitigating them developed. In this paper, risks related to the development of renewables in Azerbaijan are analyzed and assessed based on a study of expert opinion. Nine risks and risk components typical of renewable energy projects were evaluated by experts in terms of likelihood and impact; based on their opinions, risk levels are calculated and a risk profile is constructed. In general, the risks differ substantially. However, energy policy-related, grid access, and financial risks are significantly influential and require more attention.
... The number and type of risks in institutional life are ever increasing, especially in response to globalization and technological changes, which are impacting both the social and economic fundamentals of all countries. As such, numerous studies (for example, Vose, 2008; Shrader-Frechette, 2012; Haimes, 2015; Reason, 2016) have attempted to identify and analyse the numerous risks that affect the success of institutions. In the realm of HETIs, a number of risks are shown to prominently characterize institutional performance, namely, students' lack of effort (Gough, 2010; Pompa, 2014) and poor previous schooling (Bonzet, 2017). ...
Article
Full-text available
Over the past decades, governments in developing countries have continuously increased funding of their higher education institutions without a corresponding improvement in those institutions' performance. While much research on developed countries' HEIs clearly indicates the performance risks, it seems very little research has focused on the performance risks in developing countries' HETIs. As a result, many unknown risks continue to impinge on the performance of developing countries' HETIs. This study aimed to assess the performance risks facing HETIs in developing countries with a view to developing a risk management framework for them. A quantitative study employing a structured questionnaire was carried out on randomly selected respondents from the higher education sector in South Africa. This was also supported by an extensive documentary analysis of records from the sector. The general mean of the sub-factors was computed and ranked to find the significance of the discovered performance risks in HEIs in South Africa. The results provide an understanding of the innumerable performance risks, demonstrating, among other things, that academic support and the attitude of students pose serious challenges to performance management in the HETI sector in a developing country. The findings have practical implications for the management of performance in a developing country's HETIs, which are contrary to established norms in developed countries. This suggests a need for a different approach to managing performance risks in developing countries.
... First of all, it should be simple enough to allow understanding and substantive interpretation of a phenomenon or a process, which is extremely difficult in the case of such complex processes as transport and conversion of pollutants into the environment. Secondly, it should be designed to accurately capture and describe the course of a phenomenon or a process [6,7]. These models can be used to simulate the effects of long-term actions on a selected catchment area. ...
... Uncertainty can also be regarded as the inability to ascertain the exact state of a system (Haimes, 1998), which suggests that probability cannot be measured. Furthermore, Pender (2001) opined that risk applies to situations where there is a probability of repetition and replicability, while uncertainty connotes situations where no prior knowledge exists because replicability and future occurrence cannot be categorised based on past precedent. ...
Article
Full-text available
This paper provides an in-depth examination of various concepts related to the forms and sources of uncertainty, as well as the management of uncertainty in real estate development (RED). The study also examines factors influencing the adoption of Real Option Analysis (ROA) in RED given the need to improve the knowledge of stakeholders in RED appraisal, and to ensure best practices. Based on desktop analysis of past authors’ perspectives, orientations and submissions regarding the management of uncertainty in RED appraisal, the findings reveal that while there are varying forms and sources of uncertainty in RED appraisals, there are also diverse methods used to manage the uncertainty of it. It is, however, noted that the methods employed are dependent on RED appraisers and other institutional factors. The consensus from previous studies favours ROA in managing uncertainty in RED. This paper adds to the debate for the need to embrace ROA in managing the effects of uncertainty in RED appraisal.
Chapter
Electricity, communications, and water are three safety‐critical sectors of every country's economy and of its population's well‐being. This chapter introduces the interdependencies and interconnectedness (I‐I) that exist among electricity, communications, and water Complex systems of systems (Complex SoS). It focuses on the consequences resulting from the I‐I of all sectors of a country's economy via the inoperability input–output model (IIM). The IIM builds on and extends the input–output model developed by the Nobel laureate Wassily Leontief. Leontief's input–output model describes the equilibrium behavior of both regional and national economies. Regional decomposition enables a more focused and more accurate analysis of interdependencies for regions of interest in the United States. Regional IIMs can be interconnected to develop a multiregional version that improves spatial explicitness, model flexibility, and analysis coverage. The available databases of interdependency statistics provide an essential foundation for applying the IIM to model the economic consequences of emergent forced changes or a terrorist attack.
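A minimal sketch (Python/NumPy) of the inoperability input–output model (IIM) mentioned above: an initial perturbation c* (inoperability of some sectors) propagates through an interdependency matrix A*, and the equilibrium inoperability vector solves q = A*q + c*. The 3-sector matrix and the perturbation are illustrative assumptions, not calibrated data.

import numpy as np

sectors = ["electricity", "communications", "water"]
A_star = np.array([          # A_star[i, j]: inoperability passed from sector j to sector i
    [0.0, 0.2, 0.1],
    [0.4, 0.0, 0.0],
    [0.3, 0.1, 0.0],
])
c_star = np.array([0.15, 0.0, 0.0])   # 15% initial inoperability of electricity

q = np.linalg.solve(np.eye(3) - A_star, c_star)   # q = (I - A*)^-1 c*
for name, value in zip(sectors, q):
    print(f"{name:15s} equilibrium inoperability = {value:.3f}")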
Article
Distribution network expansion planning is one of the most important problems in power systems. Modeling distribution network expansion planning in the presence of distributed generators needs new requirements to model the renewable energies, electricity price, and demand uncertainties. In this paper, 2 risk indices are proposed: (1) If a scenario is flexible, it requires a relatively low investment, introduced as adjustment cost, to attain the efficiency of the optimal solution in other scenarios (ie, flexibility criterion) and (2) if the maximum distance of a solution is minimum from the optimal solution of all scenarios, then the scenario is robust (ie, robustness criterion). The resulted model is a multiobjective mixed‐integer nonlinear problem, which is solved using nondominated sorting improved harmony search algorithm‐II. To obtain the final plan, the fuzzy decision‐making analysis is applied, and to demonstrate the effectiveness of the model, it was applied to 9‐node and 69‐node distribution systems.
Article
Full-text available
In water resource system risk research, the risk identification problem should be addressed first, due to its significant impact on risk evaluation and management. Conventional risk identification methods are static and one-sided and are likely to induce problems such as ignored risk sources and ambiguous relationships among sub-systems. Hierarchical holographic modelling (HHM) and risk filtering, ranking, and management (RFRM) were employed to identify the risks of the water resources system. Firstly, water resource systems are divided into 11 major hierarchies and 39 graded holographic sub-subsystems by using the HHM framework. Iteration was applied to 4 graded holographic sub-subsystems, which were decomposed from the water resource system in the time-space domain, to accurately identify 30 initial scenarios. Then, on the basis of RFRM theory, the risk probabilities of the initial scenarios are calculated and ranked, and 13 high-risk scenarios are identified. Finally, 33 quantifiable risk indicators that characterize the risk scenarios are presented. Research results show that the risks affecting the water resources system include the composition, quantity, quality, and management of water resources, which involve many factors such as hydrology, human resources, resource allocation, and safety. Additionally, the study gives quantitative indicators for responding to high-risk scenarios to ensure that high-risk scenarios are addressed first, which is significant for the subsequent evaluation and management of water resource system risk.
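A minimal sketch (Python) of the ranking-and-filtering step described above: each risk scenario identified through HHM is scored by likelihood and consequence, ranked, and only the highest-scoring scenarios are retained for detailed management. RFRM itself comprises several more phases; the scenarios, 1-5 scales, and filtering threshold here are illustrative assumptions.

scenarios = {
    # name: (likelihood 1-5, consequence 1-5)
    "prolonged drought":         (4, 5),
    "reservoir pollution event": (2, 5),
    "pump station failure":      (3, 3),
    "data/monitoring gap":       (4, 2),
    "tariff policy change":      (2, 2),
}

scored = sorted(scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
high_risk = [(name, l * c) for name, (l, c) in scored if l * c >= 9]   # assumed filter threshold

for name, score in high_risk:
    print(f"{name:28s} risk score = {score}")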
Article
We develop a behavioral–decision model to highlight entrepreneurs’ decision making behind venture opportunism. We find that opportunism can present to entrepreneurs and their new ventures a risky yet beneficial choice to secure short–term gains at potential social costs. We posit that, motivated by loss aversion, entrepreneurs may accept the risk and engage in opportunism when their ventures confront economic losses. For instance, a high risk of venture failure may motivate entrepreneurs to act opportunistically in the hope that the failure can be averted. We further posit that such loss–averse decisions will be moderated by the entrepreneurs’ personal bonds to their new ventures. That is, the scale of entrepreneurs’ personal investment in their ventures will intensify their economic loss aversion posed by venture failure risk. In contrast, when entrepreneurs use their personal social capital to support their ventures, they will personally bear more of the down–side risks of opportunistic behavior and thus be less likely to act opportunistically to countervail a potential economic loss. Results based on the data collected from 244 NEEQ–listed new ventures in Beijing and Tianjin in China support our predictions.
Chapter
Creative methods may be partitioned along two axes: divergent versus convergent creative methods, and creative methods primarily appropriate to individuals versus creative methods primarily appropriate to teams. This chapter describes a collection of divergent creative methods for individuals. It includes a collection of divergent creative methods for teams. The chapter also describes a collection of convergent creative methods for individuals and explains convergent creative methods for teams. Other creative methods include: process map analysis, nine screens analysis, technology forecasting, design structure matrix analysis, failure mode effect analysis, anticipatory failure determination, and conflict analysis and resolution. Creative methods were selected, first and foremost, on the basis of their applicability to engineers, particularly systems engineers. Another consideration was the feasibility of implementing each creative method by practicing engineers doing their business along the entire system life cycle.
Article
Microgrids are interconnected distributed energy generation and storage systems that can act as either an extension of the existing grid or operate independently of the grid, in so-called "island mode" [1]. As a power supply technology, microgrids are attractive as they can be more reliable than existing infrastructure and reduce societies' reliance on nonrenewable energy sources [1]. As a valid alternative to reliance on centralized electricity generation and grid distribution, microgrids have value that materializes when disasters render grids inoperable and microgrids remain as islands of service. It is these specific disaster conditions that reveal the latent value of microgrids as disaster management tools, whereas under nondisaster conditions, their value remains hidden. Microgrids can thus be leveraged to manage disaster relief efforts while remaining connected to donor sources and the broader humanitarian relief supply chain.
Article
Large‐scale logistics systems operate under uncertainties of schedule, cost, environmental impacts, reliability, and others, rendering it critical for system operations to consider emergent and future conditions involving markets, technologies, environment, and organizations. Business process modeling is used widely to document the activities of an enterprise. Successful analysis of business processes requires explicit accounting for and evaluation of sources of potential disruptive risk. Previous research in the journal integrated risk identification with business process models. This paper creates a framework that evaluates the schedule disruption potential of identified sources of risk in logistics systems with the modeling of the associated business processes. The framework is demonstrated on a scheduling process at a marine container port. The methods first described in this paper should be integrated into software applications that diagram and analyze business processes of large‐scale systems.
Chapter
The authors incorporate multiple decompositions from multiple perspectives, supported and populated with Bayesian data analysis. This modeling theory, philosophy, and methodology integrates all the direct and indirect relevant information from different levels of the hierarchies while placing more emphasis on relevant direct data. The authors coordinate the results from different decompositions and perform quantitative modeling of complex systems of systems (Complex SoS) supported with multiple databases. Then, they build on hierarchical coordinated Bayesian modeling (HCBM) and the partitioned multiobjective risk method (PMRM) for risk‐based decision analysis of Complex SoS. The authors demonstrate embedding Bayes' theorem with Bellman's principle of optimality in dynamic programming for the purpose of resource allocation for intelligence gathering in countering terrorism. Finally, they discuss a two‐track intelligence collection strategy: one team aims to maximize the posterior probabilities of occurrence and one team aims to minimize the posterior probabilities of occurrence.
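A minimal sketch (Python) of only the Bayesian updating step that underlies the hierarchical coordinated Bayesian modeling described above, not the dynamic-programming allocation: a prior probability of an attack scenario is revised when an intelligence source reports a positive indication. The prior, hit rate, and false-alarm rate are illustrative assumptions.

prior = 0.05                    # assumed P(attack scenario)
p_report_given_attack = 0.70    # assumed source hit rate
p_report_given_none = 0.10      # assumed false-alarm rate

# Bayes' theorem: P(attack | report) = P(report | attack) P(attack) / P(report)
evidence = prior * p_report_given_attack + (1 - prior) * p_report_given_none
posterior = prior * p_report_given_attack / evidence
print(f"posterior P(attack | positive report) = {posterior:.3f}")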
Chapter
Protection of complex systems of systems (Complex SoS) may include a variety of risk related countermeasures, such as detection, prevention, hardening, and containment. These are all important risk management policy options aimed at increasing safety and security. Strategic preparedness refers to actions performed before a disaster as well as to the level of risk that results from such actions. Hierarchical holographic modeling (HHM) is a conceptual, functional, and systemic graphical representation of a system aimed at describing its essence and its inherent diverse characteristics and attributes. Both vulnerability and resilience are manifestations of the states of each system and of the Complex SoS as a whole. Preparedness includes the application of a mixture of preventive actions and measures to assure resilience. This chapter discusses the resilience of critical infrastructures as an emergent property of a nation's critical infrastructure Complex SoS.
Chapter
Effective modeling of interdependent and interconnected complex systems of systems commonly relies on probability and uncertainty analyses for both policy formulation and the subsequent decision‐making process. Several methodologies have been established in the literature for handling large modeling and optimization problems via decomposition and hierarchical coordination schemes. This chapter discusses several examples of regional planning and management of water resources systems via a hierarchical‐multilevel approach. It explores the importance of incorporating risk and uncertainty in modeling and policy formulation and in the decision‐making process. By applying decomposition and multilevel decision‐making methods, there is no costly sacrifice of realism in modeling since more representative and sophisticated nonlinear mathematical models can be constructed. The chapter demonstrates the efficacious contributions of risk and uncertainty to the intricate decision‐making process by incorporating new data through Bayes' theorem and with Bayesian analysis.
Chapter
The global supply chain complex systems of systems (Complex SoS) is the backbone of the U.S. economy and of the economy of every country. This chapter addresses the adversarial entities of society that seek to disrupt the supply chain and the associated risk analysis needed to ensure its viability. The multifaceted nature of the supply chain sector requires extensive modeling and understanding of this safety‐critical sector of every country's economy as interdependent and interconnected Complex SoS. The Inoperability Input–Output Model (IIM) enables the modeling of the interdependent and interconnected sectors of the economy and quantifies the adverse consequences on all affected supply chain sectors. The chapter also addresses the role of organizational infrastructure in effective operations and delivery of the Supply Chain Complex SoS. A multilayer organizational hierarchy is one of the most critical challenges in streamlining complex decision‐making policies and protocols associated with the Supply Chain Complex SoS.
Chapter
Organizational structures of interdependent and interconnected complex systems of systems (Complex SoS) vary so widely, especially among the private and public sectors, that they may be branded from their modeling perspectives as an unbounded set. Modeling dynamic systems necessarily requires comparably dynamic models. Interpersonal and intraorganizational relationships are dynamic and multifarious, a fact that makes the modeling of organizational Complex SoS a challenging enterprise, and from many perspectives opaque. There is a strong interplay between effective policy formulation and decision making that applies to, and equally propels, the success of both the public's harmonious aspirations and the organizational mission and its success. Cyber security is a generic term that connotes the complex state of reliability and confidence in the sanctity of the information delivered by a network of computer Complex SoS. This chapter defines the resilience of a cyber–physical (CP) Complex SoS as the ability of the system to recover from a cyber intrusion.
Article
Full-text available
Many evaluation methods have been used to assess the usefulness of Visual Analytics (VA) solutions. These methods stem from a variety of origins with different assumptions and goals, which cause confusion about their proofing capabilities. Moreover, the lack of discussion about the evaluation processes may limit our potential to develop new evaluation methods specialized for VA. In this paper, we present an analysis of evaluation methods that have been used to summatively evaluate VA solutions. We provide a survey and taxonomy of the evaluation methods that have appeared in the VAST literature in the past two years. We then analyze these methods in terms of validity and generalizability of their findings, as well as the feasibility of using them. We propose a new metric called summative quality to compare evaluation methods according to their ability to prove usefulness, and make recommendations for selecting evaluation methods based on their summative quality in the VA domain.
Chapter
All complex systems of systems (SoS) are guided and driven by multiple, often competing and conflicting, goals and objectives. This chapter discusses the uniqueness of multiple goals and objectives in Complex SoS. It then explains how to solve multiobjective optimization problems using the surrogate worth tradeoff (SWT) method. This method provides the interface between the decision maker (DM) and the mathematical model. Then, the chapter provides examples of Complex SoS with multiple objectives. It also addresses the challenges facing participants, such as stakeholders, in modeling and managing the development of ongoing emergent Complex SoS, with a focus on the centrality of state variables and time frame. More specifically, it addresses the critical role that shared (common) states and decisions and the time frame play in modeling the interdependencies and interconnectedness (I‐I) among the subsystems that constitute emergent Complex SoS.
Chapter
Physical infrastructures, serving as the foundations of society's well‐being and encompassing household and the entire private and public sectors, are the driving force of today's worldwide economic prosperity. This chapter focuses on fault trees and fault‐tree analyses for modeling transportation Complex Systems of Systems (SoS). Transportation is an emergent safety‐critical interdependent and interconnected sector of the economy, which in its essence constitutes Complex SoS. The chapter also focuses primarily on how automobile accidents occur and addresses the broader challenge of quantifying and managing the risk inherent in particular automobile designs and automobiles and the environment within which they operate as Complex SoS. The function of the Vehicle Complex SoS is to physically transport passengers in safety and comfort according to the directional and speed instructions given to it by the driver. Fault‐tree modeling of accident occurrence is an effective framework for describing accident causation in both quantitative and qualitative terms.
Article
Full-text available
The aquifer of the Oltrepò Pavese plain (northern Italy) is affected by paleo-saltwater intrusions that pose a contamination risk to water wells. The report first briefly describes how the presence of saline water can be predicted using geophysical investigations (electrical resistivity tomography or electromagnetic surveys) and a machine learning tool specifically developed for the investigated area. Then, a probabilistic graphical model for addressing the risk of well contamination is presented. The model, a so-called ‘influence diagram’, allows researchers to compute the conditional probability that groundwater is unsuitable for use taking into account the results of the geophysical surveys, the predictions of the machine-learning software, the related uncertainties and the prior probability of contamination in different sectors of the plain. The model, in addition, allows for calculation and comparison of the expected utility of alternative decisions (drilling or not drilling the well, or using another water source). The model is designed for use in ordinary decision situations and, although conceived for a specific area, provides an example that may be adapted to other cases. Some adaptations and generalizations of the model are also discussed.
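A minimal sketch (Python) of the expected-utility comparison an influence diagram like the one described above supports: given a (possibly survey-updated) probability that groundwater at the site is unsuitable, the expected utilities of drilling versus using another source are compared. The probability and utilities are illustrative assumptions, not the report's calibrated values.

p_unsuitable = 0.30     # assumed posterior after geophysical survey and ML prediction

utilities = {
    # action: (utility if water is suitable, utility if unsuitable)
    "drill well":       (100.0, -60.0),   # well pays off unless the water is saline
    "use other source": ( 40.0,  40.0),   # safe but less valuable alternative
}

for action, (u_ok, u_bad) in utilities.items():
    eu = (1 - p_unsuitable) * u_ok + p_unsuitable * u_bad
    print(f"{action:18s} expected utility = {eu:.1f}")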
Chapter
Risks are part of every business activity, and risk management is therefore essential in organizing a business for success. Our paper describes a methodological framework for identifying and managing risks in the context of business process analysis. The method includes business process modeling and simulation not only to organize activities but also to locate errors and manage risks. Simulating business processes supports managers in compliance analysis by investigating different scenarios of norms and regulations. We adopt as a case study a healthcare application, where risk management is even more important because errors not only increase costs, reduce quality, and lead to delays, but can also cause serious harm. This work demonstrates how to approach risk management using modeling and computational simulation, while also improving regulatory compliance.
Article
Full-text available
The fundamental design and inherent capabilities of the Graph Model for Conflict Resolution (GMCR) to address a rich range of complex real world conflict situations are put into perspective by tracing its historical development over a period spanning more than 30 years, and highlighting great opportunities for meaningful future expansions within an era of artificial intelligence (AI) and intensifying conflict in an over-crowded world. By constructing a sound theoretical foundation for GMCR based upon assumptions reflecting what actually occurs in reality, a fascinating story is narrated on how GMCR was able to expand in bold new directions as well as take advantage of many important legacy decision technologies built within the earlier Metagame Analysis and later Conflict Analysis paradigms. From its predecessors, for instance, GMCR could benefit by the employment of option form put forward within Metagame Analysis for effectively recording a conflict, as well as preference elicitation techniques and solution concepts for defining chess-like behavior when calculating stability of states from the realm of Conflict Analysis. The key ideas outlined in the paper underlying the current and projected capabilities of GMCR include the development of four different ways to handle preference uncertainty in the presence of either transitive or intransitive preferences; a wide range of solution concepts for describing many kinds of human behavior under conflict; unique coalition analysis algorithms for determining if a given decision maker can fare better in a dispute via cooperation; tracing the evolution of a conflict over time; and the matrix formulation of GMCR for computational efficiency when calculating stability and also theoretically expanding GMCR in bold new directions. Inverse engineering is mentioned as an AI extension of GMCR for computationally determining the preferences required by decision makers in order to reach a desirable state, such as a climate change agreement in which all nations significantly cut back on their greenhouse gas emissions. The basic design of a decision support system for permitting researchers and practitioners to readily apply the foregoing and other advancements in GMCR to tough real world controversies is discussed. Although GMCR has been successfully applied to challenging disputes arising in many different fields, a simple climate change negotiation conflict between the US and China is utilized to explain clearly key concepts mentioned throughout the fascinating historical journey surrounding GMCR.
Article
Managing risks to critical infrastructure systems requires decision makers to account for impacts of disruptions that render these systems inoperable. This article evaluates dock-specific resource allocation strategies to improve port preparedness by integrating a dynamic risk-based interdependency model with weighted multicriteria decision analysis techniques. A weighted decision analysis technique allows decision makers to balance widespread impacts due to cascading inoperability with certain industries that are important to the local economy. Further analysis of the relationship between inoperability and expected economic losses is explored per commodity flowing through the port, which allows an understanding of cascading impacts through interdependent industries. Uncertainty is accounted for through the use of probability distributions of total expected loss per industry that encompass the uncertainty of the length of disruption and the severity of the impact that is mitigated by alternative strategies. A set of discrete allocation options for preparedness plans is analyzed in a study of the Port of Catoosa in Oklahoma along the Mississippi River Navigation System. The economic loss analysis showed that the integration of multicriteria decision analysis helps in prioritizing strategies according to several criteria, such as gross domestic product (GDP) and decision-maker risk aversion, that are not typically addressed when strategies are prioritized according to average interdependent economic losses alone.
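As a hedged sketch of the kind of interdependency calculation such models build on, the static inoperability input-output relation q = A*q + c* can be solved as q = (I − A*)⁻¹c*; the article's model is dynamic and risk-based, and the interdependency matrix, perturbation, and GDP figures below are hypothetical.

```python
import numpy as np

# Static inoperability input-output model: q = A* q + c*, so q = (I - A*)^-1 c*.
# q  : industry inoperability levels (0 = fully operable, 1 = inoperable)
# A* : interdependency matrix (hypothetical values for three industries)
# c* : initial perturbation, e.g. a dock disruption hitting industry 0

A_star = np.array([[0.0, 0.2, 0.1],
                   [0.3, 0.0, 0.2],
                   [0.1, 0.1, 0.0]])
c_star = np.array([0.4, 0.0, 0.0])   # 40% inoperability imposed directly on industry 0

q = np.linalg.solve(np.eye(3) - A_star, c_star)
print("equilibrium inoperability per industry:", np.round(q, 3))

# Expected economic loss per industry (hypothetical output levels, in $M)
gdp = np.array([500.0, 300.0, 200.0])
print("expected loss ($M):", np.round(q * gdp, 1))
```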
Article
A successful denial-of-service (DoS) attack on a critical infrastructure (CI) can indirectly have devastating and irreversible effects on those that depend on its services. Furthermore, recent disruptions have raised concerns regarding the resiliency, security effectiveness, and emergency preparedness of CIs and dependent resources. To address the persistent challenge of protecting CIs and maintaining the essential services they provide, this research offers emergency management personnel a conceptual framework to evaluate security effectiveness and estimate the cascading effects that may result from inadequate security measures. We combine the philosophy of multi-dimensional modeling with the statistical engine of Bayesian Belief Networks to provide proactive, scenario-based interdependency analysis for CI protection and resiliency. The findings of this research resulted in a multi-dimensional approach that enables a heightened awareness of one's risk posture by highlighting the existence (strength) or absence (weakness) of relevant security factors. Through stakeholder risk assessment, preemptive implementation of threat mitigation plans for dependent resources becomes possible. Specifically, we provide this proof-of-concept, "what-if" analysis tool to assist in the reduction of vulnerabilities. To illustrate the conceptual framework, we provide a Healthcare and Public Health sector case study that evaluates the impact to a hospital patient given a successful DoS attack on a CI.
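As a minimal, purely illustrative sketch of the conditional "what-if" reasoning such a Bayesian Belief Network supports (the node structure and probabilities below are hypothetical, not the framework's actual model):

```python
# Tiny two-layer belief-network-style calculation; all numbers are hypothetical.
# Chain: DoS attack succeeds -> CI service is lost -> hospital patient is impacted.
p_service_lost_given_dos = 0.7    # assumed effect of a successful DoS on the CI
p_service_lost_otherwise = 0.05   # assumed baseline outage probability
p_impact_given_lost = 0.4         # assumed probability a patient is affected by an outage
p_impact_given_up = 0.01          # assumed baseline probability of patient impact

def p_patient_impact(p_dos_success):
    p_lost = (p_dos_success * p_service_lost_given_dos
              + (1 - p_dos_success) * p_service_lost_otherwise)
    return p_lost * p_impact_given_lost + (1 - p_lost) * p_impact_given_up

for p in (0.0, 0.5, 1.0):         # "what-if" scenarios over attack success probability
    print(f"P(patient impact | P(DoS success)={p}) = {p_patient_impact(p):.3f}")
```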
Article
Full-text available
This article proposes a public security model that incorporates Colombia's shopping centers as part of the State's critical infrastructure. Within the analysis of critical infrastructure sectors, the impact that shopping centers represent was determined, given the high volume of visitors they host, in the face of the risk of a terrorist attack. As a final reflection, it is argued that legislation oriented toward critical infrastructure, intelligence, and public-private partnerships are the first step toward a protection model for Colombia's critical infrastructure aimed at preventing the risk of terrorism.
Article
Full-text available
Understanding influent water quality variability is essential for the long-term planning of potable water systems. To quantify variability and generate realistic influent scenarios, we propose a nonparametric time series approach based on k-nearest neighbor (k-NN) bootstrap resampling. The k-NN approach resamples historical data conditioned on a “feature vector” at a given time to generate values at subsequent times. We modified this algorithm by adding random perturbations to the resampled values to generate realistic extremes unobserved in the historical record. k-NN is widely used in stochastic hydrology and hydroclimatology; however, it is adapted here for the multivariate, data-limited context of water treatment. To examine the performance of the algorithm, we applied it to an eleven-year, monthly water quality dataset of alkalinity, temperature, total organic carbon, and pH from the Cache la Poudre River in Colorado. We found that the k-NN simulations captured the relevant distributional statistics of the historical record, which suggests that the algorithm produces realistic and varied scenarios. When used in conjunction with modeling and optimization, these scenarios have the potential to improve the sustainability, resilience, and efficiency of potable water systems.
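A simplified sketch of a k-NN bootstrap with perturbation is given below; the univariate conditioning, neighbor weighting, and noise scale are illustrative simplifications, since the published algorithm conditions on a multivariate feature vector and treats extremes more carefully.

```python
import numpy as np

rng = np.random.default_rng(42)

def knn_bootstrap(series, n_steps, k=5, noise_scale=0.05):
    """Generate a synthetic trace by k-NN resampling of historical successors,
    with small random perturbations to allow unobserved extremes."""
    series = np.asarray(series, dtype=float)
    sim = [series[0]]
    for _ in range(n_steps - 1):
        current = sim[-1]
        # find the k historical values nearest to the current state
        dists = np.abs(series[:-1] - current)
        neighbors = np.argsort(dists)[:k]
        # resample one neighbor (weighted toward closer neighbors), take its successor
        weights = 1.0 / np.arange(1, k + 1)
        weights /= weights.sum()
        chosen = rng.choice(neighbors, p=weights)
        nxt = series[chosen + 1]
        # perturb so simulated values can fall outside the historical range
        nxt += rng.normal(0.0, noise_scale * series.std())
        sim.append(nxt)
    return np.array(sim)

# Example with a synthetic "monthly water quality" record (hypothetical data)
history = 50 + 10 * np.sin(np.linspace(0, 12 * np.pi, 132)) + rng.normal(0, 2, 132)
print(knn_bootstrap(history, n_steps=12).round(1))
```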
Chapter
Political risks are potential disruptions to the business activities of international companies caused by political forces and events that arise in the host country, in the home country, or from changes in the international environment. They include social conflicts and political violence as well as adverse government interventions and regulations, up to and including expropriation. Political risk analysis advises companies in identifying and prioritizing the risk scenarios relevant to them, so that the greatest political risks can be managed through targeted measures for risk control and damage limitation. The chapter introduces the concept of political risk and presents various research approaches and methods of political risk analysis. Finally, questions of political risk management are addressed.
Chapter
This chapter develops models for risk to interdependent cyber–physical complex systems of systems (Complex SoS) that build on the shared (common) states and other essential entities among the systems and subsystems that constitute Complex SoS. It presents four case studies that focus on safety‐critical cyber–physical systems and infrastructures, utilizing state‐space theory and methodology for modeling and managing Complex SoS. The Federal Aviation Administration (FAA) is planning the integration of three major systems – communications (C), navigation (N), and surveillance (S) – into one complex system of systems (CNS Complex SoS). The fault‐tree model provided a demonstration of the premise that users of cloud‐computing technology (CCT) Complex SoS are more at risk than users of non‐CCT systems. The concept of interdependent subsystems with shared states, and the resulting interdependent failure probabilities, can be used to analyze the reliability of other Complex SoS.
Conference Paper
Linear programming deals with problems such as (see [4], [5]): maximize a linear function $$ g(\mathbf{x}) \equiv \sum_{i=1}^{n} c_i x_i $$ of $n$ real variables $x_1, \ldots, x_n$ (forming a vector $\mathbf{x}$), constrained by $m + n$ linear inequalities.
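As an illustrative example of this problem class (the coefficients are arbitrary, not drawn from the paper), a small LP of this form can be solved with an off-the-shelf solver:

```python
from scipy.optimize import linprog

# Maximize 3*x1 + 2*x2 subject to
#   x1 + x2 <= 4,  x1 + 3*x2 <= 6,  x1 >= 0,  x2 >= 0.
# linprog minimizes, so the objective coefficients are negated.
res = linprog(c=[-3, -2],
              A_ub=[[1, 1], [1, 3]],
              b_ub=[4, 6],
              bounds=[(0, None), (0, None)])
print("x* =", res.x, " max value =", -res.fun)   # optimum at x* = (4, 0), value 12
```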
The problem considered is that of obtaining solutions to large nonlinear mathematical programs by coordinated solution of smaller subproblems. If all functions in the original problem are additively separable, this can be done by finding a saddle point for the associated Lagrangian function. Coordination is then accomplished by shadow prices, with these prices chosen to solve a dual program. Characteristics of the dual program are investigated, and an algorithm is proposed in which subproblems are solved for given shadow prices. These solutions provide the value and gradient of the dual function, and this information is used to update the shadow prices so that the dual problem is brought closer to solution. Application to two classes of problems is given. The first class is one whose constraints describe a system of coupled subsystems; the second is a class of multi-item inventory problems whose decision variables may be discrete.
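A toy sketch of this coordination scheme follows (the separable objective, coupling constraint, and step size are hypothetical): each subproblem is solved for a given shadow price, and the price is then updated using the constraint violation, which is the gradient of the dual function.

```python
# Toy Lagrangian dual decomposition: minimize (x1-3)^2 + (x2-2)^2
# subject to the coupling constraint x1 + x2 = 4. All numbers are illustrative.

def solve_subproblem_1(lmbda):
    # argmin over x1 of (x1 - 3)^2 + lmbda * x1  (closed form)
    return 3.0 - lmbda / 2.0

def solve_subproblem_2(lmbda):
    # argmin over x2 of (x2 - 2)^2 + lmbda * x2  (closed form)
    return 2.0 - lmbda / 2.0

lmbda, step = 0.0, 0.5
for _ in range(50):
    x1 = solve_subproblem_1(lmbda)
    x2 = solve_subproblem_2(lmbda)
    violation = x1 + x2 - 4.0      # gradient of the dual function at lmbda
    lmbda += step * violation      # move the shadow price toward dual optimality

print(f"lambda = {lmbda:.3f}, x1 = {x1:.3f}, x2 = {x2:.3f}")
# Converges to lambda = 1, x1 = 2.5, x2 = 1.5, which satisfies x1 + x2 = 4.
```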
  • Graphical Solution
The graphical solution to the optimization problem posed by Eqs. (A.74) to (A.76) yields $x_1^* = 2$, $x_2^* = 2$, and $f(x_1^*, x_2^*) = 8$ (the reader is encouraged to derive this solution). It is evident that the constraint $g_1(x_1, x_2)$ is binding, whereas $g_2(x_1, x_2)$ is not.
A constructive proof of the Kuhn-Tucker multiplier rule
  • E J Beltrami
Beltrami, E.J., 1968, A constructive proof of the Kuhn-Tucker multiplier rule, Proceedings, SIAM National Meeting, Toronto.
  • M D Intriligator
Intriligator, M.D., 1971, Mathematical Optimization and Economic Theory, Prentice Hall, Englewood Cliffs, NJ.