Article

On “Black Swans” and “Perfect Storms”: Risk Analysis and Management When Statistics Are Not Enough

Authors:
M. Elisabeth Paté-Cornell

Abstract

Two images, "black swans" and "perfect storms," have struck the public's imagination and are used, at times indiscriminately, to describe the unthinkable or the extremely unlikely. These metaphors have been used as excuses to wait for an accident to happen before taking risk management measures, both in industry and government. The two images represent two distinct types of uncertainty: epistemic and aleatory. Existing statistics are often insufficient to support risk management because the sample may be too small and the system may have changed. Rationality as defined by the von Neumann axioms leads to a combination of both types of uncertainty into a single probability measure (Bayesian probability) and accounts only for risk aversion; yet the decisionmaker may also want to be ambiguity averse. This article presents an engineering risk analysis perspective on the problem, using all available information in support of proactive risk management decisions and considering both types of uncertainty. The proposed measures involve monitoring of signals, precursors, and near-misses, as well as reinforcement of the system and a thoughtful response strategy. They also involve careful examination of organizational factors such as the incentive system, which shape human performance and affect the risk of errors. In all cases, including rare events, risk quantification does not allow "prediction" of accidents and catastrophes; rather, it is meant to support effective risk management instead of mere reaction to the latest events and headlines.
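To make the combination of the two uncertainty types concrete, the following minimal sketch (an illustration, not the article's own computation; the rate prior and time horizon are hypothetical) folds epistemic uncertainty about an accident rate and aleatory uncertainty about occurrences into a single Bayesian predictive probability.

```python
import numpy as np

rng = np.random.default_rng(42)

# Epistemic uncertainty: the annual accident rate is not known exactly.
# Represent it with a (hypothetical) lognormal prior over the rate.
n_epistemic = 10_000
rates = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n_epistemic)

# Aleatory uncertainty: given a rate, occurrences over a 20-year horizon
# are random (Poisson). The predictive probability of at least one event
# averages the aleatory probability over the epistemic distribution.
horizon_years = 20
p_at_least_one_given_rate = 1.0 - np.exp(-rates * horizon_years)
predictive_probability = p_at_least_one_given_rate.mean()

print(f"Predictive P(>=1 event in {horizon_years} yr): {predictive_probability:.4f}")
print("5th-95th percentile of the conditional probability (epistemic spread):",
      np.percentile(p_at_least_one_given_rate, [5, 95]).round(4))
```

The single predictive number is what von Neumann-type expected-utility reasoning would use; the percentile spread of the conditional probabilities displays the epistemic component that an ambiguity-averse decisionmaker may wish to weigh separately.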


... In risk management terminology (2), it is used to refer to the chance of something happening, regardless of whether it is defined, measured, or determined objectively or subjectively, qualitatively or quantitatively, or whether it is described in general or mathematical terms. ... (2009), are very common in the field and involve both the strategic need to "do the right thing" and the operational need to "do things the right way." ...
... Once the benefits of implementing risk management in an organization are understood, it is relevant to present some principles so that this type of management can be adopted effectively. To that end, we list the survey carried out by ABNT (2009): i) it is transparent and inclusive; j) it is dynamic, iterative, and capable of reacting to change; k) it facilitates continual improvement of the organization. ... United States and responsible for training military and civilian personnel with regard to procurement processes for defense equipment, logistics, and military technology. ...
... In addition to the possible measures for treating risks, Summers and Boothroyd (2009) also propose possible responses for dealing with opportunities: ...
Thesis
Full-text available
This dissertation aims to investigate the application of risk management practices on the evaluation of Brazilian graduate programs, notably for the accreditation process adopted to assess new course proposals. Considering the growth in the number of master's and doctoral courses in the country, as well as the need to minimize losses due to the failure witnessed in some of them, risk management is proposed as a tool to identify obstacles to the development of such courses. Based on this, we believe it is possible to understand and monitor risks in graduate courses, providing the Brazilian Federal Agency for Support and Evaluation of Graduate Education (CAPES) with material information to help mitigate risks in these programs. To achieve that goal, this research analyzed unwanted results from the recent Quadrennial Evaluation of graduate courses, which took place in 2017: stagnation, involution, and discontinuity of graduate programs. Based on these observations, we have selected relevant indicators to investigate the possibility of identifying risks by a quantitative analysis. For this, data were obtained from sources such as CAPES, MEC, INEP, IBGE, PNUD, and CGEE. To clean and explore such data, the statistical package R and Tableau Desktop software were used, also allowing a complex analysis through the application of statistical and Machine Learning techniques, which included Naive Bayes algorithms, Logistic Regression, and Classification and Regression Trees. From such investigation, we identified the need to combine quantitative risk analysis with qualitative methods to obtain reliable results. Therefore, we propose a methodology to develop a risk register from the analysis of new course proposals, which is usually performed by committees of ad hoc consultants. With this, we hope that a cycle of identification, mapping, and recording of risks, followed by proper monitoring of graduate programs, will play a relevant role in providing better conditions for the continuous success of the Brazilian System of Graduate Courses (SNPG).
... On the other hand, the terminology of "unprecedented" events, and the implication that the current historical context is fundamentally different from the past (Kelman, 2014), can be leveraged to imply that disasters can be neither mitigated nor planned for, and to excuse poor preparation (Paté-Cornell, 2012;Hewitt, 2019). This situation has led some authors to suggest that the very concept of a "Black Swan" may be a "red herring" (Shaw, 2014), since the question of whether or not an event could have been foreseen is more a question of imagination and approach than of genuine unpredictability. ...
... Many authors (e.g., Altinok and Ersoy, 2000;Tinti et al., 2005;Synolakis et al., 2011;IOC, 2020) have recognised that such an event would likely lead to a disaster, and have more severe consequences than in 365CE, particularly given high and increasing coastal population densities ( > 1000 people per km 2 in the Nile Delta; Figure 1B). This recognition has led to significant research into the hazard component of the disaster (e.g., Papazachos, 1996;Shaw et al., 2008;Synolakis et al., 2011;Valle et al., 2014;England et al., 2015;Necmioglu and Özel, 2015;Howell et al., 2017). However, consideration of the socio-political and economic consequences has predominantly been limited to specific geographic regions (such as Crete; Synolakis et al., 2011, or EU member states; International Bank for Reconstruction and Development/The World Bank, 2021), or industries (e.g., oil; Cruz et al., 2011). ...
... 2017, also referred to as the "availability heuristic" in psychology; Tversky and Kahneman, 1973) in the case studies invoked as justifications (Section 3.4), which were dominantly from the last 20 years. This focus on the contemporary might mean that insights from older events are missed, and lead to the fallacy of planning for the most recent disaster, rather than anticipating the next (Ewing and Synolakis, 2011;Paté-Cornell, 2012). The scenarios generated in this workshop, therefore, have components consistent with pre-existing analyses of risk in the Eastern Mediterranean, but have a greater emphasis on unquantifiable risks, and are, therefore, potentially useful in identifying risks which might not currently be considered in individual risk analyses. ...
Article
Full-text available
How to recognise potential disasters is a question at the centre of risk analysis. Over-reliance on an incomplete, often epistemologically-biased, historical record, and a focus on quantified and quantifiable risks, have contributed to unanticipated disasters dominating both casualties and financial losses in the first part of the 21st century. Here we present the findings of an online workshop implementing a new scenario-planning method, called downward counterfactual analysis, which is designed to expand the range of risks considered. Interdisciplinary groups of disaster researchers constructed downward counterfactuals for a present-day version of the 365CE Cretan earthquake and tsunami, imagining how these events might have been worse. The resulting counterfactuals have trans-national, long-term impacts, particularly in terms of economic losses, and connect risks previously identified in separate sectors. Most counterfactuals involved socio-political factors, rather than intrinsic components of the hazard, consistent with the idea that there are “no natural disasters”. The prevalence of cascading counterfactuals in our workshop suggests that further work is required to give the appropriate weight to pre-existing economic and social conditions in scenario-planning methods, such as downward counterfactual analysis, which focus on the occurrence of a hazard as the temporal starting point for a disaster. Both proposed counterfactuals and their justifications reflect a bias towards contemporary issues and recent historical disasters. We suggest that interdisciplinary groups can expand the range of imagined risks. However, the setup used here would be improved by including local stakeholders. Qualitative forms of downward counterfactual analysis have potential applications for community engagement and education, as well as for risk analysis.
... The metaphor of perfect storms has been discussed in relation to black swans by Paté-Cornell [59] and Aven [6]. Aven states that "In relation to perfect storms, the variation in the phenomena is known and we face risk problems where the uncertainties are small; the knowledge base is strong and accurate predictions can be made" ( [6], p. 122). ...
... Hence, there is a link to both black swans and grey swans. The perfect storm metaphor has been discussed in relation to the black swan metaphor by Paté-Cornell [59], who distinguishes between the two by relating them to different types of uncertainties: "'Perfect storms' involve mostly aleatory uncertainties (randomness) in conjunctions of rare but known events. 'Black swans' represent the ultimate epistemic uncertainty or lack of fundamental knowledge" ( [59], p. 1827). ...
... The perfect storm metaphor has been discussed in relation to the black swan metaphor by Paté-Cornell [59], who distinguishes between the two by relating them to different types of uncertainties: "'Perfect storms' involve mostly aleatory uncertainties (randomness) in conjunctions of rare but known events. 'Black swans' represent the ultimate epistemic uncertainty or lack of fundamental knowledge" ( [59], p. 1827). In line with this reasoning, Aven [6] discusses perfect storms in relation to the type c) black swan mentioned in Section 2.1 (surprising, extreme events not believed to occur because of very low judged probability), stating that perfect storms are events that can be predicted (using frequentist probabilities) with large accuracy and small uncertainty, whereas black swans of type 3 are described using subjective probabilities and cannot be predicted with this level of accuracy. ...
Article
Different metaphors have been introduced to reflect the occurrence of rare and surprising types of events with extreme impacts, including black swans, grey swans and dragon-kings. Despite considerable research on clarifying the meaning of these concepts, their relationship still remains unclear. The present paper aims to meet this challenge, by reviewing current definitions and interpretations found in the literature and referred to in practice, analysing these definitions and interpretations, and providing a structure for improved understanding of the differences and similarities between the various metaphors. The paper also discusses some of the implications the use of these concepts has for risk management and decision-making.
... Major technological accidents seemingly coming out of the blue are sometimes labeled "Black Swan" events to explain why they could not be prevented (e.g. the accident in Bhopal in 1984 (Murphy and Conner, 2012) or the Deepwater Horizon oil spill (Lodge, 2010)). Black Swans are extreme outliers impossible to foresee (Taleb, 2010), and as such can provide an excuse for risk owners and managers for why no or insufficient risk management measures were taken prior to an accident (Paté-Cornell, 2012). For instance, after the Deepwater Horizon accident the CEO of Exxon reportedly accused BP of doing a great disservice to international oil companies by suggesting that the disaster was not a Black-Swan event but that it instead had implications for the whole industry (Crooks, 2011). ...
... Secondly, it has to have a major impact, and thirdly, it can be explained in hindsight, making it appear predictable. Epistemic uncertainty is central to the Black-Swan concept, as such events express the ultimate lack of fundamental knowledge, representing "unknown unknowns" (Paté-Cornell, 2012). Aven and Krohn (2014) distinguish the following three interpretations or types of Black Swans: ...
... Even in case of strong signals, when the writing is on the proverbial wall, high-inertia risk management may fail to react fast enough to allow intervention before a loss occurs. Fear of false alerts or downright denial may mean that information is not communicated or that the potential severity of a situation is not believed at the decision-making level (Paté-Cornell, 2012). ...
Article
Full-text available
Technological accidents are a threat to the population, the environment and the economy. Occasionally, the notion of “Black Swan” event is applied to such accidents as an explanation for why they could not be prevented. By their very nature, Black Swans are considered extreme outliers which are impossible to anticipate or manage. However, technological accidents are generally foreseeable and therefore preventable when the associated risk is managed responsibly and when warning signs are not ignored. Consequently, such accidents cannot be considered Black Swans. We contend that the same holds for technological accidents triggered by natural hazards (so-called Natech accidents) which usually result from a lack of corporate oversight and insufficient application of state-of-the-art knowledge in managing the associated risk. We argue that the successful reduction of Natech risk requires a corporate mindfulness of the risk and the need to address it using updated approaches, the recognition that organizational behavior influences the risk significantly, and risk ownership that departs from the “Act-of-God” mindset which much of the discussion around natural hazards is fraught with. The study also highlights the importance of scientific research and knowledge management to reduce risks.
... Though the potential threat was identified, the precise event of COVID-19 was neither predicted nor prevented. In this sense, many epidemics may be considered black swan (low probability, high impact) events (Taleb, 2007;Paté-Cornell, 2012). ...
... Acknowledging that such events will occur but without knowing precisely what, when, or where (Plowright et al., 2017), we emphasize the importance of well-equipped food and public health systems (Paté-Cornell, 2012). A connected and resilient food system, which is crucial to support the nutrition and health needs of a thriving global population, depends on the ability of all countries to detect and respond to such events. ...
Article
Full-text available
Livestock production and global trade are key components to achieving food security, but are bedfellows with the risk for emergence and spread of infectious diseases. The World Trade Organization's Agreement on the Application of Sanitary and Phytosanitary Measures outlines provisions for member countries to protect animal, plant, and public health while promoting free trade. The capacity for risk analysis equips countries to increase access to export markets, improve local animal health and food safety regarding known hazards, and build the institutional capacity to respond to unexpected events. The COVID-19 pandemic has highlighted the need to detect, report, and implement effective response measures to emerging challenges on a local and global scale, and it is crucial that these measures are implemented in a way that supports food production and trade. The use of risk analysis coupled with sound understanding of underlying system dynamics will contribute to resilient and enduring food systems.
... Many investigations of failed projects suggest that what at the time came as complete surprises likely could have been anticipated given due diligence. If extremely rare, highly consequential and almost unpredictable, the term "black swan" can be used (Taleb 2007;Paté-Cornell 2012). For the less extreme cases that in hindsight were reasonably foreseeable, Phoon (2017) uses the term "grey swan". ...
... As a final comment we would particularly like to highlight how, among the risks identified with the proposed design solution, the risk associated with low and variable undrained shear strength of the clay can be treated. By applying the observational method (see further details on this method in Peck 1969;CEN 2004;Fuentes, Pillai, and Ferreira 2018;Spross and Larsson 2019;Spross, Bergman, and Larsson 2021;Powderham and O'Brien 2020), the excavation works can be better monitored and controlled. For example, the excavation can be performed stepwise, starting with the least sensitive part of the pit. ...
Article
Full-text available
Unexpected and unforeseen geotechnical events cause large cost increases in geotechnical engineering projects and threaten construction workers’ health and safety all around the world. Practical tools and guidelines for how to implement structured and effective risk management methods in geotechnical engineering projects are however few and rarely applied. The Swedish Geotechnical Society has therefore published a methodology for this issue. A key activity in this methodology is to create an understanding of and to interpret the geotechnical context in which the project is to be carried out. This paper presents a guide for how practising geotechnical engineers, hydrogeologists, and other related professionals can perform this activity in a structured way. The procedure is illustrated through the foundation design for a new office building in a geotechnically challenging environment. [Available for free download.]
... Thus, the way that the uncertainty has been characterised may be significantly underestimating the risk [7]. There have been several engineering failures that were due in part to underestimating risks in ways similar to this example [7,8]. Before the 1986 Challenger Disaster, NASA management had predicted the probability of failure with loss of vehicle and crew as 1 in 10^5 flights [9]. ...
... For example, if a = [2, 3] and b = [4, 5], then a × b = [8, 15]. However, if it were the case that a and b were oppositely dependent on each other, such that a low value of a is always matched with a high value of b, then a × b is the much narrower interval [10, 12]. ...
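A small sketch of the interval arithmetic in the excerpt above (the letters a and b stand in for the variable symbols lost in extraction; the dependence handling shown is a simple corner analysis for positive intervals, not the cited paper's implementation):

```python
from itertools import product

def interval_product_independent(a, b):
    # Without dependence information, take the envelope of all corner products.
    corners = [x * y for x, y in product(a, b)]
    return (min(corners), max(corners))

def interval_product_opposite(a, b):
    # Opposite (countermonotone) dependence: low values of a pair with high values of b.
    candidates = [a[0] * b[1], a[1] * b[0]]
    return (min(candidates), max(candidates))

a, b = (2, 3), (4, 5)
print(interval_product_independent(a, b))  # (8, 15)
print(interval_product_opposite(a, b))     # (10, 12)
```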
Preprint
Full-text available
An uncertainty compiler is a tool that automatically translates original computer source code lacking explicit uncertainty analysis into code containing appropriate uncertainty representations and uncertainty propagation algorithms. We have developed a prototype uncertainty compiler along with an associated object-oriented uncertainty language in the form of a stand-alone Python library. It handles the specifications of input uncertainties and inserts calls to intrusive uncertainty quantification algorithms in the library. The uncertainty compiler can apply intrusive uncertainty propagation methods to codes or parts of codes and therefore more comprehensively and flexibly address both epistemic and aleatory uncertainties.
... Of course, nobody can prevent natural hazards, such as earthquakes or tsunamis, due to the underlying randomness of such "perfect storms" (Park et al., 2016;Paté-Cornell, 2012). The public discourse of Fukushima as a natural disaster should, therefore, not have direct implications for people's trust in the operators and regulators of nuclear energy stations. ...
... However, the fact that the nuclear power station was sited in a location with an increased seismic risk placed the focus on human involvement instead of natural hazards. Furthermore, focusing the public dialogue on the epistemic uncertainty of "black swans," and on the risks and likelihoods of accidents, contributed to a stronger focus on the risks of nuclear energy (Arlt & Wolling, 2016;Park et al., 2016;Paté-Cornell, 2012). The fact that there was a nuclear accident in a highly developed country such as Japan might have reinforced the perception that similar accidents were likely in Germany (i.e., increasing the risk perception of nuclear energy, reducing the trust in regulators and providers). ...
Article
Public trust is being lamented as the central victim of our new, digital information environment, a notion that is depicted in labeling our society as “posttruth” or “posttrust.” Within this article, we aim to call this deficit view of public trust into question and kindle a more positive outlook in future research. For this, we utilize the Social Amplification of Risk Framework to discuss trust as an inherent aspect of social interactions and to question the frameworks’ normative approach to public trust and risk perception. Utilizing a literature review of prior studies that investigated trust within the structure of SARF and a case study on the impacts of Fukushima on public trust in nuclear energy, we would like to argue that the current normative “trust deficit model” should be overcome and future risk research should increasingly focus on the opportunities of the digital informational environment for risk communication.
... The second chosen concept is the black swan effect, where the black swan is a symbol of what was originally considered impossible, a highly improbable phenomenon, or of the so-called perfect storm (Paté-Cornell, 2012: 1823). For millennia, people in the northern hemisphere knew only white swans, yet two thousand years ago the Roman poet Juvenal used the term as a metaphor for the unthinkable. ...
... Nevertheless, representatives of many fields realize that the arrival of a black swan can disturb the equilibrium of systems in which instability had not previously manifested itself, or even been expected (Adamec-Schüllerová, 2015). The black swan simile is also misused as an excuse by politicians or economists before adequate measures are taken to manage a developing crisis (Paté-Cornell, 2012). The author could not have been closer to the truth in connection with the pandemic crisis of 2020-2021... The black swan concept is commonly used today in relation to risk and safety (Aven, 2013: 48-49). ...
Article
Full-text available
Since the beginning of the 21st century, the European Union has been undergoing a wave of broad-based crises. The Eurozone crisis, the effects of which are still being felt by the EU economy and the integration process as a whole, the subsequent so-called migration crisis with its impact on the integration dynamic, the British referendum on Brexit, and the pandemic crisis, which may change the integration slowdown. Does this development correspond to the concept of chaos or is it the arrival of a black swan? What security challenges have these crises brought? Macroeconomic vulnerability, geo-economic vulnerability, polarisation of the EU Member States, a decline in the Union unity and a return to national interests. But also, a rethinking of self-sufficiency in strategic zones, questions of strategic autonomy in critical infrastructure. The pursuit of strategic autonomy is likely to require a shift towards majority voting within the EU. The crises of recent years have led to an increase in the EU’s geopolitical emphasis, but also to a reduction in democracy and transparency in democratic systems.
... At the same time, to reduce the complexity and multifariousness of an issue is a risky strategy: it may lead to some important and severe risks not being prepared for, and if they occur, the municipality will have developed no capacity to cope with them. It may also cause the municipality to overlook possible overlapping risks, which can lead to a so-called "perfect storm" [57]. This prompts the question as to whether contemporary risk assessment tools are a proper fit for the creeping crisis of climate change. ...
Article
Full-text available
Creating preparedness for climate change has become an increasingly important task for society. In Sweden, the responsibility for crisis preparedness rests to a large extent on the municipalities. Through an interview study of municipal officials, this paper examines municipalities’ crisis preparedness for climate change and the role they assign to citizens. The theoretical approach is that of risk governance, which adopts an inclusive approach to risk management, and that of risk sociology, which states that how a problem is defined determines how it should be handled and by whom. The empirical results show that the municipal officials mainly discuss technically defined risks, such as certain kinds of climate-related extreme events, the handling of which does not require any substantial involvement of citizens. Citizens’ responsibility is only to be individually prepared, and thereby they do not require municipal resources to protect their own properties in the case of an extreme event. The municipalities, however, feel that their citizens have not developed this individual preparedness and therefore they try to better inform them. This analysis finds five different views of citizens, all with their own problems, and to which the municipalities respond with different communicative measures. By way of conclusion, three crucial aspects are raised regarding the task of making societies better prepared for climate change.
... In contrast, little work has been done on very intense and, in a statistical sense, very rare temperature extremes that occur once in several centuries to millennia, although the probability of historically unprecedented hot extremes increases with anthropogenic global warming (Lewis & King, 2017;Diffenbaugh et al., 2018;Fischer & Schär, 2010). In the literature, such a low-probability, high-impact event is also referred to as a perfect storm or grey swan, defining an unseen but predictable event (Paté-Cornell, 2012;Stein & Stein, 2014). In the following, the term very rare heat waves is used to describe low-probability, high-impact events of very high near-surface air temperature anomalies. ...
Article
Heat waves such as the one in Europe 2003 have severe consequences for the economy, society, and ecosystems. It is unclear whether temperatures could have exceeded these anomalies even without further climate change. Developing storylines and quantifying highest possible temperature levels is challenging given the lack of long homogeneous time series and methodological framework to assess them. Here, we address this challenge by analysing summer temperatures in a nearly 5000-year pre-industrial climate model simulation, performed with the Community Earth System Model CESM1. To assess how anomalous temperatures could get, we compare storylines, generated by three different methods: (1) a return-level estimate, deduced from a generalized extreme value distribution, (2) a regression model, based on dynamic and thermodynamic heat wave drivers, and (3) a novel ensemble boosting method, generating large samples of re-initialized extreme heat waves in the long climate simulation. All methods provide consistent temperature estimates, suggesting that historical exceptional heat waves as in Chicago 1995, Europe 2003 and Russia 2010 could have been substantially exceeded even in the absence of further global warming. These estimated unseen heat waves are caused by the same drivers as moderate observed events, but with more anomalous patterns. Moreover, altered contributions of circulation and soil moisture to temperature anomalies include amplified feedbacks in the surface energy budget. The methodological framework of combining different storyline approaches of heat waves with magnitudes beyond the observational record may ultimately contribute to adaptation and to the stress testing of ecosystems or socio-economic systems to increase resilience to extreme climate stressors.
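As a hedged illustration of the first storyline method mentioned in the abstract (a return-level estimate from a generalized extreme value fit), the sketch below fits a GEV to synthetic annual maxima standing in for the long CESM1 simulation; all parameter values are placeholders.

```python
from scipy.stats import genextreme

# Synthetic annual maxima of summer temperature anomalies (placeholder for
# the ~5000 years of pre-industrial CESM1 output used in the study).
annual_maxima = genextreme.rvs(c=0.1, loc=2.0, scale=0.8, size=5000, random_state=0)

# Fit a GEV distribution and estimate the level exceeded once in 1000 years on average.
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)
return_period = 1000
return_level = genextreme.isf(1.0 / return_period, c_hat, loc_hat, scale_hat)

print(f"Estimated {return_period}-year return level: {return_level:.2f} K anomaly")
```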
... When characterizing this process, a crucial differentiation must however be made between "Mass-Attackers" (who focus on high-volume of possibly low-value targets) and "Advanced Persistent Threats" (or APTs for short) generated by highly specialized groups that target specifically few high-value targets. Targeted cyberattacks are characterized by a strong "adversarial" connotation (Rios Insua, Rios, & Banks, 2009) where the attacker can (and has the resources to) perform sophisticated reconnaissance of the target(s), identify suitable 0-day attacks, and tailor the whole attack process against the specific target (Paté-Cornell, 2012). Yet, these attackers only make up for a small fraction of the attack space (Bilge & Dumitras, 2012;Research, 2018). ...
Article
Full-text available
The assumption that a cyberattacker will potentially exploit all present vulnerabilities drives most modern cyber risk management practices and the corresponding security investments. We propose a new attacker model, based on dynamic optimization, where we demonstrate that large, initial, fixed costs of exploit development induce attackers to delay implementation and deployment of exploits of vulnerabilities. The theoretical model predicts that mass attackers will preferably (i) exploit only one vulnerability per software version, (ii) largely include only vulnerabilities requiring low attack complexity, and (iii) be slow at trying to weaponize new vulnerabilities. These predictions are empirically validated on a large data set of observed mass attacks launched against a large collection of information systems. Findings in this article allow cyber risk managers to better concentrate their efforts for vulnerability management, and set a new theoretical and empirical basis for further research defining attacker (offensive) processes.
... Mignan et al. (2014) showed that taking the interaction between several hazards into account leads to the emergence of extremes through clustering of losses (risk migration) and loss amplification at higher losses (risk amplification). Those observations follow the work on extreme events categorized as "perfect storms" and "black swans" or "dragon kings" (Helbing, 2013;Paté-Cornell, 2012;Sachs, Yoder, Turcotte, Rundle, & Malamud, 2012;Sornette, 2009;Taleb & Blyth, 2016). In the work of Mignan et al. (2014), the artificial nature of the exercise and the lack of geographical reference combined with the complexity inherent in multirisk assessment make it difficult to communicate the outcomes to nonspecialists (i.e., emergency services, communities). ...
Article
Full-text available
The impact of natural disasters has been increasing in recent years. Despite the developing international interest in multihazard events, few studies quantify the dynamic interactions that characterize these phenomena. It is argued that without considering the dynamic complexity of natural catastrophes, impact assessments will underestimate risk and misinform emergency management priorities. The ability to generate multihazard scenarios with impacts at a desired level is important for emergency planning and resilience assessment. This article demonstrates a framework for using graph theory and networks to generate and model the complex impacts of multihazard scenarios. First, the combination of maximal hazard footprints and exposed nodes (e.g., infrastructure) is used to create the hazard network. Iterative simulation of the network, defined by actual hazard magnitudes, is then used to provide the overall compounded impact from a sequence of hazards. Outputs of the method are used to study distributional ranges of multihazards impact. The 2016 Kaikōura earthquake is used as a calibrating event to demonstrate that the method can reproduce the same scale of impacts as a real event. The cascading hazards included numerous landslides, allowing us to show that the scenario set generated includes the actual impacts that occurred during the 2016 events.
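A toy sketch of the graph-based idea described in the abstract: hazards and exposed infrastructure are nodes, dependency edges carry the cascade, and the impacted set is obtained by propagating failures downstream. The network, node names, and single-trigger logic are invented for illustration and are far simpler than the article's footprint- and magnitude-based simulation.

```python
import networkx as nx

# Directed dependency graph: an edge u -> v means failure or occurrence of u
# can propagate an impact to v.
G = nx.DiGraph()
G.add_edges_from([
    ("earthquake", "landslide"),     # primary hazard triggering a secondary hazard
    ("landslide", "road"),           # hazard footprint hits exposed infrastructure
    ("earthquake", "power"),
    ("power", "water_pump"),
    ("road", "hospital_access"),
])

def cascade(graph, initial_hazards):
    """Return the set of nodes impacted by propagating failures downstream."""
    impacted = set(initial_hazards)
    frontier = list(initial_hazards)
    while frontier:
        node = frontier.pop()
        for successor in graph.successors(node):
            if successor not in impacted:
                impacted.add(successor)
                frontier.append(successor)
    return impacted

print(cascade(G, {"earthquake"}))
```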
... Such scenarios will result in increased volatility and uncertainty in the ASC operations. Further, the ASC is becoming more complex with the rise of "black swans" and "perfect storms", terms which help to describe the unthinkable or the improbable (Paté-Cornell, 2012). Such events create ambiguity in ASC decisions. ...
Article
The world is affected by demand volatility, process uncertainty, supply chain complexity, and information ambiguity, together forming a VUCA world. To manage this scenario, industries are adopting emerging technologies for business excellence, one of which is Blockchain. Blockchain technology (BCT) is a distributed ledger technology (DLT) that stores transactional records in a tamper-proof and immutable way; it is a promising solution for incorporating transparency and traceability into traditional ecosystems. Automotive industries face a Volatile environment, Uncertain schedules and information, Complex supply chain networks, and Ambiguous decisions that cripple the automotive supply chain (ASC). Therefore, BCT can be used to address issues related to the ASC in a VUCA world. Keeping this in mind, the study reports a systematic literature review (SLR) of BCT applications in the ASC. More than seventy research papers were reviewed based on different BCT characteristics and applications. Through content analysis, the study explores how to link supply chain visibility and information transparency with BCT for an efficient ASC in a VUCA world. Moreover, a BCT implementation framework is proposed for the ASC, to provide a decision-making approach for practitioners in a VUCA world.
... Everything that is preventive rather than reactive will always be better, and therefore less expensive, for companies; information is something very delicate in daily operations at all levels. Therefore, it is necessary to manage different scenarios in which even catastrophic events such as black swans are considered by companies [8], or to think about integrating anti-bribery policies as a result of comprehensive risk analysis, in such a way that the reliability, integrity, availability, and non-repudiation of the information can be ensured, granting flexibility, reliability and, above all, digital confidence to both the client and the employer. ...
... single-fault, joint-fault, etc.) to take into account in the expectation and develop a probability model for those scenarios. However, there are a number of difficulties in estimating these probabilities, especially when predicting low-probability unprecedented events: a rate may not be available and any rate taken from previous systems may not apply to the new system, and the designer may not have a good understanding of what hazards may occur [55,225]. Thus, future work needs to investigate means of identifying (automatically from a model or discursively) high-consequence failures (e.g., [150]) and determine how to estimate/specify a probability model when hazard scenario information is sparse. ...
Thesis
It is desirable for complex engineered systems to perform missions efficiently and economically, even when these missions' complex, variable, long-term operational profiles make it likely for hazards to arise. It is thus important to design these systems to be resilient so that they will actively prevent and recover from hazards when they occur. To most effectively design a system to be resilient, the resilience of each design alternative should be quantified and valued so that it can be incorporated in the decision-making process. However, considering resilience in early design is challenging because resilience is a dynamic and stochastic property characterizing how the system performs over time in a set of unlikely-but-salient hazardous scenarios. Quantifying these properties thus requires a model to simulate the system's dynamic behavior and performance over the set of hazardous scenarios. Thus, to be able to incorporate resilience in the design process, there is a need to develop a framework which implements and integrates these models with design exploration and decision-making. This dissertation fulfills this need by defining resilience to enable fault simulations to be incorporated in decision-making, devising and implementing a modelling framework for early assessment of system resilience attributes, and exploring optimization architectures to efficiently structure the design exploration of resilience variables. Additionally, this dissertation provides a validity testing framework to determine when the resilient design process has been effective given the uncertainties present in the design problem. When each of these parts are used together, they comprise an overall framework that can be used to consider and incorporate system resilience in the early design process.
... These events can be explained only in the aftermath and cannot be anticipated, just as the black swan was believed to be impossible before its discovery in the 17th century (Taleb, 2007). However, the concept may be misused as it may represent a reason for ignoring the potential for major accidents and avoiding the implementation of long-term safety measures (Paté-Cornell, 2012). ...
... Decision making and risk analysis address a host of diverse real-world problems (Borgonovo, Cappelli, Maccheroni, & Marinacci, 2018;Howard, 1988;Kontosakos, Hwang, Kallinterakis, & Pantelous, 2020;McGill, Ayyub, & Kaminskiy, 2007;Paté-Cornell, 2012) by traditionally drawing heavily on quantitative methodologies, which rely on interdisciplinary designs via the integration of several disciplines (Aven, 2012;Aven & Zio, 2014). Despite the large number of risk-based decision-making frameworks developed in that respect, it is nevertheless interesting to note that they only rarely incorporate causal arguments in the treatment of decisions, even though real-world practice decisions often depend heavily on causal knowledge and reasoning (Hagmayer & Fernbach, 2017). ...
Article
Full-text available
Either in the form of nature's wrath or a pandemic, catastrophes cause major destructions in societies, thus requiring policy and decisionmakers to take urgent action by evaluating a host of interdependent parameters, and possible scenarios. The primary purpose of this article is to propose a novel risk‐based, decision‐making methodology capable of unveiling causal relationships between pairs of variables. Motivated by the ongoing global emergency of the coronavirus pandemic, the article elaborates on this powerful quantitative framework drawing on data from the United States at the county level aiming at assisting policy and decision makers in taking timely action amid this emergency. This methodology offers a basis for identifying potential scenarios and consequences of the ongoing 2020 pandemic by drawing on weather variables to examine the causal impact of changing weather on the trend of daily coronavirus cases.
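The abstract does not name the specific causal-inference technique; purely as an illustration of pairwise causal testing between a weather variable and daily case counts, the sketch below runs a standard Granger-causality test on synthetic data (an assumption on my part, not necessarily the article's method).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)

# Synthetic stand-ins for a county-level weather series and daily case counts,
# with the case counts depending on temperature lagged by three days.
n = 200
temperature = rng.normal(20, 5, size=n)
cases = np.roll(temperature, 3) * 0.5 + rng.normal(0, 2, size=n)

data = pd.DataFrame({"cases": cases, "temperature": temperature})

# Tests whether lags of the second column (temperature) help predict the first (cases).
results = grangercausalitytests(data[["cases", "temperature"]], maxlag=5)
```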
... Dutch seafarers, who until then had known only white swans in their home country, suddenly observed black swans on their voyage to Australia in the 17th century, something they would not have thought possible based on their prior observation and experience at home (Paté-Cornell, 2012). Technical developments that can prevent errors do not, however, grow by themselves out of working conditions and equipment; they are conceived and constructed by people and can themselves become new sources of error. They therefore do not capture all conceivable circumstances, courses of events, or measures, but only those that have been recognized and judged important. ...
Article
Full-text available
Everyone makes mistakes, and people should learn from their mistakes, as the saying goes. Yet if you yourself are affected by errors and their effects, for example as a passenger in an airplane, you would not want to be exposed to them. From its early years, aviation benefited from psychology in the field of troubleshooting, and it has now become the safest means of transport in terms of distance traveled. This is also because risk and error management, together with a safety culture developed and implemented in a variety of ways, addresses the sources of error proactively, not just reactively. In addition to a brief introduction to human factors, the present work also develops a transfer of knowledge back to psychology, primarily with reference to psychological diagnostics and assessment.
... Addressing uncertainty is a major subject of ongoing risk advances [24]. This includes how to deal with surprising events (e.g., black swans [13,252]) and deep uncertainty or ambiguity (where reasonable probability estimates are unavailable or the relevance of past data is in doubt [80,296]). These developments in risk science are ongoing and will continue to provide insight for addressing uncertainty in complex problems, such as climate change. ...
Thesis
The climate crisis is an unprecedented threat. We urgently need to design our infrastructure, economic, and agricultural systems and our communities to withstand hazards and reduce risk to address this threat. This dissertation contributes by exploring the potential of data-driven urban planning and through increasing our understanding of how risk and data science can be used to build the resilience of our communities. Central to this thesis is the understanding that risk analysis (the assessment, characterization, communication, and management of risk, along with related policy) can enhance urban planning to better mitigate hazards and protect our communities. To improve risk analysis's efficacy for use in urban planning, there are a series of necessary advances to the science of risk (i.e., the knowledge, frameworks, and principles that underlie risk analysis). Each chapter of this dissertation contributes to these advances, including how we focus risk analysis for the betterment of people, how we leverage data science to understand the role of urban form in hazard mitigation, how we incorporate spatiotemporal and behavioral feedbacks into risk analysis, and how we capture resilience within the risk concept. The primary aims of the dissertation were to: 1) Explore the potential for risk science to be used to support urban planning, 2) Advance methods and understanding of spatiotemporal risk analysis, and 3) Propose an operational approach to building the resilience of communities to hazards. The first chapter identifies how urban planning challenges can develop and motivate developments in risk science. I then advance approaches for conducting risk analysis that captures spatiotemporal and behavioral feedbacks using a coupled complex system model in the second chapter. The third chapter uses machine learning and spatial data to explore how urban characteristics are associated with high temperature, that could lead to higher risk. The next section, chapters four and five, focuses risk analysis on people. I propose that the focus of resilience efforts be on the equitable provision of essential services, such as health care, food, and education. Specifically, we can measure how people's access to essential services changes due to a hazard and across demographic groups. The framework I propose can be used by decision makers before, during, and after a hazard to improve the social sustainability and reduce the long-term risk of a community. In the final two chapters I argue that we must explicitly consider the dimension of time in risk analysis and that this means that the pillars of resilience can be addressed within the concept of risk. This understanding, coupled with the other work within this dissertation, means that resilience, and resilience analysis, is well within the purview of modern risk analysis.
... The design height of 5.7m was based on an estimate that the probability that a tsunami in the Fukushima area could be more than 6m high was less than 10^-2 in the next 50 years. However, data was available detailing previous earthquakes, such as the Sanriku earthquake of 869, of magnitude 8.1, which is estimated to have caused a tsunami up to 20m high (Paté-Cornell, 2012). What makes this an unknown known is that it seems that events occurring more than 1000 years ago were ignored and considered outside the scope of the data set used for the risk prediction. ...
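A quick check of the design-basis figure quoted in the excerpt above, under a simple (assumed) stationary Poisson model: translating the 10^-2-in-50-years bound into an implied annual exceedance rate and mean return period.

```python
import math

# Stated design basis: P(tsunami > 6 m within 50 years) < 1e-2.
p_50yr = 1e-2
years = 50

# Under a stationary Poisson assumption, back out the implied annual rate
# and the corresponding mean return period.
annual_rate = -math.log(1.0 - p_50yr) / years
return_period = 1.0 / annual_rate

print(f"Implied annual exceedance rate: {annual_rate:.2e} per year")
print(f"Implied mean return period: {return_period:.0f} years")  # roughly 5,000 years
```

An implied return period of roughly 5,000 years sits uneasily beside a far larger event only about 1,100 years earlier, which is the excerpt's point about ignoring old data.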
... Strategies such as being highly selective about scenarios are clearly difficult in a security domain. Security professionals are encouraged to 'think the unthinkable' and ask 'what if?' (NSM, PST, and POD 2015). Many prefer to use the term 'possibility' instead of probability (A1, P3, C4, C8), which draws attention to possible scenarios, not likely ones. ...
Article
Full-text available
Security and ‘securing’ is high on the public agenda. Questions are raised on where, to what degree and against what the government and others should introduce preventive security measures. This article investigates a controversy in Norway about the role of probability in risk assessment within security. The article asks how the question of the probability of incidents is problematized and addressed by actors involved. It discusses how the controversy can be interpreted and what it might tell us about security and risk. The article builds on an exploratory study of the reasoning of security professionals in relation to a standard on security risk assessment. It shows how the downplaying of probability is defended, but also how it creates dilemmas and is criticized. The argument against estimating probability is that it is often difficult or impossible. Probability is, however, also a moderating factor. Probability turns unlikely futures into lower risks than likely futures. Those arguing against the security risk standard point to the consequence of downplaying probability in risk estimates. A key finding is how risk assessment in areas of low tolerance for incidents introduces a discrepancy that is difficult to handle. On the one hand, security analysts are supposed to deal with threats as risks, implying scaling, comparison and level of acceptance. On the other hand, they are supposed to create security, implying the opposite of scaling and risk acceptance. Risk assessment becomes difficult if there is little appetite for taking risk. Michael Power’s three ideal models of risk management logics are introduced in the discussion as heuristic tools of a sensitizing kind. The article concludes that risk research could benefit from engaging with security theory, to investigate how risk management might be shaped by security practises.
... The probability and magnitude of such singularities, or unrepeatable events, cannot be estimated by studying the other floods in the catchment. Their estimation requires assembling evidence about the relevant influencing factors and projecting them through a causal model (Hall & Anderson, 2002), applying, for instance, methods developed in the field of probabilistic risk analysis (Paté-Cornell, 2012). In this review, we propose nine hypotheses on generating mechanisms and discuss to which extent the current knowledge allows to support or falsify each hypothesis. ...
Article
Full-text available
Statistical distributions of flood peak discharge often show heavy tail behavior, that is, extreme floods are more likely to occur than would be predicted by commonly used distributions that have exponential asymptotic behavior. This heavy tail behavior may surprise flood managers and citizens, as human intuition tends to expect light tail behavior, and the heaviness of the tails is very difficult to predict, which may lead to unnecessarily high flood damage. Despite its high importance, the literature on the heavy tail behavior of flood distributions is rather fragmented. In this review, we provide a coherent overview of the processes causing heavy flood tails and the implications for science and practice. Specifically, we propose nine hypotheses on the mechanisms causing heavy tails in flood peak distributions related to processes in the atmosphere, the catchment, and the river system. We then discuss to which extent the current knowledge supports or contradicts these hypotheses. We also discuss the statistical conditions for the emergence of heavy tail behavior based on derived distribution theory and relate them to the hypotheses and flood generation mechanisms. We review the degree to which the heaviness of the tails can be predicted from process knowledge and data. Finally, we recommend further research toward testing the hypotheses and improving the prediction of heavy tails.
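To illustrate the review's central contrast numerically, the sketch below compares rare-flood quantiles from a light-tailed (Gumbel) and a heavy-tailed (GEV with positive shape) distribution sharing the same location and scale; the parameters are hypothetical.

```python
from scipy.stats import genextreme, gumbel_r

# Light-tailed (Gumbel) versus heavy-tailed (GEV) peak-discharge distributions;
# parameters are illustrative, not fitted to any real catchment.
light = gumbel_r(loc=100.0, scale=30.0)
# scipy's genextreme uses c = -xi, so c < 0 corresponds to a heavy upper tail (xi > 0).
heavy = genextreme(c=-0.3, loc=100.0, scale=30.0)

for rp in (100, 1000):
    q_light = light.isf(1.0 / rp)
    q_heavy = heavy.isf(1.0 / rp)
    print(f"{rp:>5}-yr flood:  light tail {q_light:7.1f}   heavy tail {q_heavy:7.1f}")
```

The heavy-tailed distribution inflates the 1,000-year quantile far more than the 100-year one, which is why the heaviness of the tail matters so much for rare-flood surprise.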
... The related topics of risk evaluation, risk acceptability, and economic aspects of safety decision-making (Ale, 2005;Reniers and Van Erp, 2016) also present fruitful avenues for future research. Further issues related to risk assessment concern how to identify and manage risks related to black swan events (Paté-Cornell, 2012), which are also an important aspect of Sam Mannan's safety triad for process safety (O'Connor et al., 2019). An issue needing more scientific attention is interorganizational accident risk management (Milch and Laumann, 2016), which has received little attention in the wider process safety research community and appears lacking in a Chinese context. ...
Article
This article presents a bibliometric analysis and mapping of the Chinese process safety research, focusing on the contributions made in core process safety journals and on the influences of international collaborations and knowledge sources on the developments of this research domain. Collaboration networks, term co-occurrence networks, and the co-citation network were analyzed to identify trends, patterns, and the knowledge distribution of the Chinese research on process safety. Work to date has been clustered mainly on safety of chemical processes, fire and explosion, and risk management and accidents. Chinese research contributions are concentrated in only a few journals, while the corresponding intellectual base draws on the wider literature focused on understanding and modeling phenomena, and on the broader risk research literature, although to a lesser extent. While various foreign authors are highly cited by Chinese authors, only very few direct collaborations with international scholars are identified. The results are used as a basis for a discussion on future research directions and developments for the community. Increased focus on uncertainty treatment and handling of black swan events, risk evaluation and economic aspects of safety decisions, interorganizational risk management, road and maritime transport of hazardous substances, risk perception and communication, and integrated safety and security assessment, are highlighted as fruitful directions for future scholarship. It is hoped that the insights obtained from this work can facilitate new and consolidated collaborations, as well as further invigorate the Chinese process safety domain, ultimately contributing to improved safety performance of process industries in China and elsewhere.
... It is common to refer to traditional approaches for risk analysis and management, which proceed from the premise that the events (hazards, threats) are identifiable (Park, Seager, Rao, Convertino, & Linkov, 2013). However, contemporary risk analysis and risk science also capture unknown types of events and potential surprises (black swans) (Aven, 2020;Aven & Renn, 2018;Paté-Cornell, 2012;Paté-Cornell & Cox 2014;SRA, 2017). ...
Article
Full-text available
“Risk” and “resilience” are both terms with a long history, but how they are related and should be related are strongly debated. This article discusses the appropriateness of a perspective advocated by an active “resilience school” that sees risk as a change in critical system functionality, as a result of an event (disturbance, hazard, threat, accident), but not covering the recovery from the event. From this perspective, two theses are examined: risk and resilience are disjunct concepts, and risk is an aspect of resilience. Through the use of several examples and reasoning, the article shows that this perspective challenges daily‐life uses of the risk term, common practices of risk assessments and risk management, as well as contemporary risk science. A fundamental problem with the perspective is that system recovery is also an important aspect of risk, not only of resilience. Risk and resilience analysis and management implications of the conceptual analysis are also discussed.
... This issue is generally related to surprises relative to knowledge and how they are examined in the analysis (Aven and Zio 2018;Aven and Kristensen 2019). Several studies have tried to classify and discuss methods to address SrK (Aven 2013b, 2015; Paté-Cornell 2012). In this regard, Aven (2015) considers three types of surprising events (black swans): ...
Article
Full-text available
Epistemic uncertainty in seismic hazard analysis is traditionally addressed by utilizing a logic-tree structure with subjective probabilities for branches. However, many studies have argued that probability is not a suitable choice for addressing epistemic uncertainties; in particular in addressing the background knowledge supporting the probabilities. In this regard, the application of imprecise probability (IP) is investigated. It is discussed that IP could provide a flexible tool for a more objective presentation of experts' knowledge. Moreover, the importance of addressing the strength of knowledge and surprises relative to knowledge in seismic hazard analysis along with methods to do so, are discussed. Then, a method is proposed for providing seismic hazard curves with consideration of IPs for logic-tree branches and addressing the knowledge dimension. It is suggested to consider the worst possible combination of branch probabilities for the expected intensity measures at all return-periods in order to construct the seismic hazard curve. A process is also suggested to demonstrate how the proposed seismic hazard analysis method could be properly used in the seismic design of buildings. The performance of the suggested method was investigated on Uniform California Earthquake Rupture Forecast, Version 2 (UCERF2) logic-tree for two sites in Los Angeles and Oakland, California, US. The results indicate that even with a similar logic-tree, the effects of imprecision in logic-tree weights could be different at different sites.
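A schematic sketch of the "worst possible combination of branch probabilities" idea described in the abstract: branch weights are intervals rather than point probabilities, and an upper-bound hazard curve is taken as the maximum weighted exceedance over renormalized corner combinations. This simplified corner search is an illustration only (the exact bound generally requires an optimization over the feasible weight set), and the branch curves and intervals are made up.

```python
import itertools
import numpy as np

# Exceedance probabilities (per year) of a ground-motion level for three
# logic-tree branches, at a few return-period points (values are illustrative).
branch_curves = np.array([
    [1e-2, 1e-3, 1e-4],
    [2e-2, 3e-3, 5e-4],
    [5e-3, 5e-4, 5e-5],
])

# Imprecise branch weights given as intervals instead of single probabilities.
weight_intervals = [(0.2, 0.5), (0.3, 0.6), (0.1, 0.4)]

worst_case = np.zeros(branch_curves.shape[1])
for corners in itertools.product(*weight_intervals):
    w = np.array(corners)
    w = w / w.sum()  # renormalize the corner combination to sum to one
    worst_case = np.maximum(worst_case, w @ branch_curves)

print("Upper-bound hazard curve:", worst_case)
```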
... The metaphor was first presented by Taleb (2007) and further scrutinized, discussed, and developed in a risk-assessment context by risk science professionals (e.g., Gross, 2010; Paté-Cornell, 2012; Lindley, 2013; Aven, 2014). Aven (2014) describes surprises (black swan events) as appearing unexpected in the light of current knowledge/beliefs and carrying severe consequences. ...
Article
Full-text available
A better understanding of the potential cumulative impacts of large-scale fish farming, could help marine aquaculture to become more environmentally sustainable. Risk assessment plays an important role in this process by elucidating the main challenges and associated risk factors. An appropriate aquaculture risk assessment should contribute to mutual risk understanding and risk acknowledgement among stakeholders, and thus common perspectives on measures and governance. In this paper, we describe an approach to risk assessment in marine aquaculture that aims to promote fruitful discussions about risk and risk-influencing factors across stakeholders with different value perceptions. We elaborate on the concept of risk and risk terminology and conclude that new aquaculture risk assessment methodology should be guided by risk science. The suggested methodology is based on the latest thinking in risk science and has been tested in a thorough study of environmental risk related to Norwegian aquaculture. The study shows that the new methodical approach has an immanent pedagogical potential and contributes to strengthening risk understanding and risk acknowledgement among stakeholders. In conclusion, the suggested risk assessment methodology has proved a valuable tool for marine scientists in analyzing, evaluating, and communicating environmental risk.
... The related topics of risk evaluation, risk acceptability, and economic aspects of safety decision-making (Ale, 2005;Reniers and Van Erp, 2016) also present fruitful avenues for future research. Further issues related to risk assessment concerns how to identify and manage risks related to black swan events (Paté-Cornell, 2012), which are also an important aspect of Sam Mannan's safety triad for process safety (O'Connor et al., 2019). An issue needing more scientific attention is interorganizational accident risk management (Milch and Laumann, 2016), which has received little attention in the wider process safety research community and appears lacking in a Chinese context. ...
Article
Full-text available
This article presents a bibliometric analysis and mapping of the Chinese process safety research, focusing on the contributions made in core process safety journals and on the influences of international collaborations and knowledge sources on the developments of this research domain. Collaboration networks, term co-occurrence networks, and co-citation networks were analyzed to identify trends, patterns, and the knowledge distribution of the Chinese research on process safety. Work to date has clustered mainly around the safety of chemical processes, fire and explosion, and risk management and accidents. Chinese research contributions are concentrated in only a few journals, while the corresponding intellectual base draws on the wider literature focused on understanding and modeling phenomena, and on the broader risk research literature, although to a lesser extent. While various foreign authors are highly cited by Chinese authors, only very few direct collaborations with international scholars are identified. The results are used as a basis for a discussion on future research directions and developments for the community. Increased focus on uncertainty treatment and handling of black swan events, risk evaluation and economic aspects of safety decisions, interorganizational risk management, road and maritime transport of hazardous substances, risk perception and communication, and integrated safety and security assessment, are highlighted as fruitful directions for future scholarship. It is hoped that the insights obtained from this work can facilitate new and consolidated collaborations, as well as further invigorate the Chinese process safety domain, ultimately contributing to improved safety performance of process industries in China and elsewhere.
Article
Full-text available
Floods in river deltas are driven by complex interactions between astronomical tides, sea levels, storm surges, wind waves, rainfall-runoff, and river discharge. Given the anticipated increase in compound flood hazards in river deltas in a warming climate, climate-informed estimation of regional to local extreme water levels (EWLs) is thus critical for decision-makers to evaluate flood hazards and take adaptation measures. We develop a simple yet computationally efficient stress-test framework, which combines historical and projected climatological information and a state-of-the-art hydrodynamic model, to assess future compound coastal-fluvial flood hazards in river deltas. Our framework is applied in the world's largest single urban area, China's Pearl River Delta (PRD), which is also characterized by a dense, interconnected river network. We find that extreme sea level is the dominant driver of compound coastal-fluvial floods in the PRD over the past 60 years. Meanwhile, there is large spatial heterogeneity in the individual and compound effects of typhoon intensity, local sea-level rise, and riverine inflow on coastal-fluvial floods. In a plausible disruptive scenario (e.g., a 0.50 m sea-level rise combined with a 9% increase in typhoon intensity under 2°C warming), the EWL will increase by 0.76 m on average. An additional increase of 1.54 m and 0.56 m in EWL will occur in the river network and near the river mouth, respectively, if coastal floods coincide with the upstream mean annual flood. Findings from our modeling framework provide important insights to guide adaptation planning in river deltas to withstand future compound floods under climate change.
Chapter
The US Department of Homeland Security (DHS) defines risk as the "potential for an adverse outcome assessed as a function of threats, vulnerabilities, and consequences associated with an incident, event, or occurrence." National security risk analyses are conducted across a spectrum of threats, ranging from nuclear terrorism to pandemic diseases. These analyses often rely on historical data and/or data derived from simulation or expert judgment to quantify and model the various elements of risk. This chapter focuses on the evolution of DHS's definition of the elements of security risk, i.e. threat, vulnerability, and consequence. An overview of the modeling methods and approaches typically adopted for characterizing these risk components is also provided.
Article
Full-text available
2020 is the year of wildfire records. California experienced its three largest fires early in its fire season. The Pantanal, the largest wetland on the planet, burned over 20% of its surface. More than 18 million hectares of forest and bushland burned during the 2019-2020 fire season in Australia, killing 33 people, destroying nearly 2500 homes, and endangering many endemic species. The direct cost of damages is being counted in tens of billions of dollars, but the indirect costs on water-related ecosystem services and benefits could be equally expensive, with impacts lasting for decades. In Australia, the extreme precipitation (>200 mm per day in several locations) that interrupted the catastrophic wildfire season triggered a series of watershed effects from headwaters to areas downstream. The increased runoff and erosion from burned areas disrupted water supplies in several locations. These post-fire watershed hazards via source water contamination, flash floods, and mudslides can represent substantial, systemic long-term risks to drinking water production, aquatic life, and socio-economic activity. Scenarios similar to the recent event in Australia are now predicted to unfold in the Western USA. This is a new reality that societies will have to live with as uncharted fire activity, water crises, and a widespread human footprint collide around the world. Therefore, we advocate for a more proactive approach to wildfire-watershed risk governance in an effort to advance and protect water security. We also argue that there is no easy solution to reducing this risk and that investments in both green (i.e., natural) and grey (i.e., built) infrastructure will be necessary. Further, we propose strategies to combine modern data analytics with existing tools for use by water and land managers worldwide to leverage several decades' worth of data and knowledge on post-fire hydrology.
Article
This paper aims to test the adoption of complexity science theory by Zimbabwean banks to fulfil the Basel Committee's demand for a new method and paradigm for emerging risk management. Unlike conventional risks, emerging risks lack a standard definition, framework, and management paradigm, which poses challenges for Zimbabwean banks and globally. Data are collected by a structured questionnaire from 120 risk managers in 16 banks. Banks are divided into strata by bank size and ownership structure. Data analysis uses descriptive statistics. This paper finds higher levels of complexity science adoption on the four realms of emergence and adaptive properties, but lower adoption of modeling methods. The level of adoption and awareness is similar in pan-African, indigenous, and international banks. This paper contributes to the body of knowledge in two specific ways. First, it adds insights to the debate on finding a standard definition for emerging risks. Second, it pioneers and validates the adoption of complexity science theory for emerging risk management in Zimbabwean banks.
Article
Full-text available
As climate change increases the frequency and severity of disasters, and population and social changes raise the public’s vulnerability to disaster events, societies face additional risk of multiple disaster events or other hazards occurring simultaneously. Such hazards involve significant uncertainty, which must be translated into concrete plans able to be implemented by disaster workers. Little research has explored how disaster managers incorporate different forms of knowledge and uncertainty into preparations for simultaneous hazards or disaster events, or how front-line disaster workers respond to and implement these plans. In this paper I draw on ethnographic research working as a wildland firefighter, interviews with firefighters and fire managers, and state and agency planning documents to examine preparations for two events occurring in Central Oregon in August 2017: (1) the height of wildfire season and (2) hundreds of thousands of anticipated visitors for a total solar eclipse. I find that different qualities of risk, hazard, and uncertainty across these two events were central to the development and implementation of disaster plans. Agency leaders devised worst-case scenario plans for the eclipse based on uncertain predictions regarding hazards from the eclipse and the occurrence of severe wildfires, aiming to eliminate the potential for unknown hazards. These plans were generally met with skepticism by front-line disaster workers. Despite the uncertainties that dominated eclipse-planning rhetoric, firefighters largely identified risks from the eclipse that were risks they dealt with in their daily work as firefighters. I conclude by discussing implications of these findings for conceptual understandings of disaster planning as well as contemporary concerns about skepticism and conspiracy theories directed at government planning and response to disaster events.
Article
Full-text available
Extreme and impactful weather events of the recent past provide a vital but under-utilised data source for understanding present and future climate risks. Extreme event attribution (EEA) enables us to quantify the influence of anthropogenic climate change (ACC) on a given event in a way that can be tailored to stakeholder needs, thereby enhancing the potential utility of studying past events. Here we set out a framework for systematically recording key details of high-impact events on a national scale (using the UK and Puerto Rico as examples), combining recent advances in event attribution with the risk framework. These ‘inventories’ inherently provide useful information depending on a user’s interest. For example, as a compilation of the impacts of ACC, we find that in the UK since 2000, at least 1500 excess deaths are directly attributable to human-induced climate change, while in Puerto Rico the increased intensity of Hurricane Maria alone led to the deaths of up to 3670 people. We also explore how inventories form a foundation for further analysis, learning from past events. This involves identifying the most damaging hazards and crucially also vulnerabilities and exposure characteristics over time. To build a risk assessment for heat-related mortality in the UK we focus on a vulnerable group, elderly urban populations, and project changes in the hazard and exposure within the same framework. Without improved preparedness, the risk to this group is likely to increase by ∼50% by 2028 and ∼150% by 2043. In addition, the framework allows the exploration of the likelihood of otherwise unprecedented events, or 'Black Swans’. Finally, not only does it aid disaster preparedness and adaptation at local and national scales, such inventories also provide a new source of evidence for global stocktakes on adaptation and loss and damage such as mandated by the Paris Climate Agreement.
Chapter
The competitive environment, so strongly globalised, means firms must operate in conditions of uncertainty and consequently adopt appropriate strategic actions to best exploit competitive advantages. The aim of this chapter is, therefore, to analyse the role of corporate governance in situations of uncertainty and complexity, as well as to illustrate the set of strategies available to the corporate system. More specifically, after analysing the role of strategic and operational management in complex areas, characterised by strong uncertainty, we apply those concepts by focusing on the economic-financial strategies and competitive levers that can be used to manage financial decisions correctly.
Article
Full-text available
Society is concerned about maritime accidents since pollution, such as oil spills from ship accidents, adversely affects the marine environment. Operational and strategic pollution preparedness and response risk management are essential activities to mitigate such adverse impacts. Quantitative risk models and decision support systems (DSS) have been proposed to support these risk management activities. However, there currently is a lack of computationally fast and accurate models to estimate oil spill consequences. While resource-intensive simulation models are available to make accurate predictions, these are slow and cannot easily be integrated into quantitative risk models or DSS. Hence, there is a need to develop solutions to accelerate the computational process. A fast and accurate metamodel is developed in this work to predict damage and oil outflow in tanker collision accidents. To achieve this, multiobjective optimization is applied to three metamodeling approaches: Deep Neural Network, Polynomial Regression, and Gradient Boosting Regression Tree. The data used in these learning algorithms are generated using state-of-the-art engineering models for accidental damage and oil outflow dynamics. The multiobjective optimization approach leads to a computationally efficient and accurate model chosen from a set of optimized models. The results demonstrate the metamodel's robust capacity to provide accurate and computationally efficient estimates of damage extents and volume of oil outflow. This model can be used in maritime risk analysis contexts, particularly in strategic pollution preparedness and response management. The models can also be linked to operational response DSS when fast and reasonably accurate estimates of spill sizes are critical.
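A minimal sketch of the metamodeling idea this abstract describes: replace an expensive engineering model with a fast learned surrogate. Gradient boosting is one of the three approaches named in the abstract, but the inputs, ranges, and the toy response below are assumptions used only for illustration, not the authors' data or model.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    X = np.column_stack([rng.uniform(2, 16, n),        # striking-ship speed (kn), assumed
                         rng.uniform(0, 90, n),        # collision angle (deg), assumed
                         rng.uniform(5e3, 1.5e5, n)])  # striking-ship displacement (t), assumed
    # toy stand-in for the engineering model's oil-outflow output (m^3)
    y = 0.002 * X[:, 2] * (X[:, 0] / 10) ** 2 * np.sin(np.radians(X[:, 1]) / 2 + 0.1) \
        + rng.normal(0, 5, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    surrogate = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                          learning_rate=0.05).fit(X_tr, y_tr)
    print("R^2 on held-out cases:", surrogate.score(X_te, y_te))

Once trained on simulator runs, such a surrogate returns an estimate in microseconds, which is what allows it to be embedded in a risk model or an operational DSS.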
Article
Psychotherapy, as a treatment for people who are ill, aims to help them overcome their suffering. To achieve this goal effectively and reliably, guidelines and manuals are increasingly being provided, and actual effectiveness is being examined in psychotherapy research. When a goal in psychotherapy occasionally cannot be achieved, the question of causes arises. Failure is one possible explanation; errors, mistakes, and rule violations are others. Distinguishing among them can clarify causes and encourage learning from them.
Chapter
Airports have experienced an increasing volatility in air traffic due to more competitive air transport markets and external shock events. This volatility is illustrated for a number of individual airports. The unique impact of the COVID-19 pandemic on air travel is analyzed in terms of deep uncertainty, Taleb's Black and White Swans and Wucker's Gray Rhino. Various management tools intend to integrate the uncertainty of disruptive shock events in strategic airport planning. Specific attention is paid to aviation-related scenario development by scanning the shaping drivers of threats and opportunities in the business environment. Flexible airport master planning and diversification strategies contribute to strategic responsiveness and financial robustness of airports. The unique impact of COVID-19 on the airport business reveals the limits of this anticipatory responsiveness.
Article
Recently, there has been a discussion in the safety science community concerning the validity of basic approaches to safety, referred to as Safety I, Safety II and Safety III, with Erik Hollnagel and Nancy Leveson in leading roles. The present paper seeks to obtain new knowledge concerning the issues raised, by adding a contemporary risk science perspective to the discussion. Risk is, to a limited degree, addressed in the literature presenting and studying these three approaches; however, as argued in the paper, the concept of risk and risk analysis and management principles and methods are highly relevant and useful for understanding the three safety approaches, deciding on their suitability, and eventually applying them. The paper underlines the importance of an integration of the safety and risk sciences, to further enhance concepts, approaches, principles, methods and models for understanding, assessing, communicating and managing system performance.
Book
Full-text available
PREFACE The second edition of ICAFI was dedicated to reflection on Accounting, Finance and Technologies in the Learning Organization. This was the specific challenge of the 2021 conference edition, and different and complementary perspectives on this theme were sought from multidisciplinary fields. Thus, in the second edition of ICAFI 2021, we received dozens of interesting approaches and contributions related to the Learning Organization, which is a very dynamic and challenging environment, with an international perspective. We addressed accounting, finance and technology in learning organizations, with inspirational sessions related to artificial intelligence, digital assets and reporting of social and corporate responsibility. The parallel sessions covered topics related to the history of accounting, standardization, taxation, sustainability, management control, performance evaluation, social economy, corporate governance models, among others. Adapting to the demands placed on us by the COVID-19 pandemic, this year's ICAFI 2021 took place digitally. However, the digital environment, which seemed like an obstacle before, turned out to be an excellent way to stimulate knowledge sharing and strengthen community spirit. Looking to the near future, we will be back on June 30 - July 1, 2022 with the third edition of ICAFI, fully focused on Sustainability, Business and Innovation. We will be waiting for you! See our website for more details: http://icafi.pt/. On the following pages you will find the submissions presented at ICAFI 2021. They are sorted alphabetically by the first author's last name.
Article
This study investigates risk perception and network contagion in the consumption of shared rides (Uber and carpooling). The universe of the study relates directly to the markets affected by the "Uber economy", mainly taxis and other means of urban transport. The central hypothesis of this study proposes that risks spread through interpersonal relationships among members of a community and that this risk can be propagated by a first actor. The gap explored here concerns the contagion of risk perception in access economies for shared rides and its impact on the taxi market. The findings indicate that safety and comfort are the main aspects of ride-sharing consumption. They also point to flaws in the system for evaluating users and service providers when profiles are validated through social networks such as Facebook and others. The analysis leads to the conclusion that, since safety and comfort are key attributes, platform features should be developed to improve the applications, mainly in the evaluation of the parties involved and in vehicle-maintenance indicators. Finally, the possibility is raised of extending the findings of this research to other businesses conducted on digital sharing-economy platforms.
Chapter
This chapter focuses on cyber-insurance for Cyber-Physical Systems (CPS). We identify two types of networked systems: traditional Information Technology Systems (ITS) and CPS. They differ with respect to cyber-security: the security challenges of CPSs are driven by their particulars (physical features and capabilities, regulations, and attacker motivations). We discuss the factors complicating the advancement of a CPS cyber-insurance ecosystem, including extreme information scarcity and risk-assessment problems exacerbated by the growing complexity of CPSs and the intricacies of risk propagation. We conclude that, without new government policies improving security information, the cyber-insurance market for CPS may stall.
Chapter
In this chapter, we define cyber-risks, summarize their properties, and discuss the tools of cyber-risk management. We provide a comparative analysis of cyber-risks and other risks (natural disasters, terrorism, etc.). Importantly, we divide networked systems into two domains, traditional Information Technology Systems (ITSs) and Cyber-Physical Systems (CPSs), and compare and contrast their cyber-risks. We demonstrate that these domains have distinct cyber-risk features and distinct bottlenecks in risk management. We suggest that this dichotomy is a useful tool for cyber-risk analysis and management. In the companion chapter on cyber-insurance, we apply this classification to simplify the exposition of cyber-insurance for CPS.
Article
After the 2011 Great East Japan disaster, residents of Fukushima were inundated with media photographs that painted a dire picture. As emotionally triggering photographs have been established as a potential barrier to recovery from trauma, there is a need to better understand their impact on the socio-psychological recovery of disaster survivors. Drawing from media system dependency theory and cognitive neuroscience, the affective circumplex model and an adaptive photo-elicitation interview technique offer unique understandings of affective responses to photographs. Results indicate that although impactful media photographs can act as recurring stimuli to the experienced disaster, over time they can also interrupt negative thought processes and encourage post-traumatic growth.
Chapter
This chapter introduces the concept of environmental risk assessment. The background and risk analysis methods are comprehensively discussed. Various qualitative and quantitative methods for risk assessment are presented, along with data sources, technical guidance documents, and software tools to conduct ecological risk assessment. Examples of risk assessment are presented along with perspectives.
Article
Full-text available
This work investigates aspects of the global sensitivity analysis of computer codes when alternative plausible distributions for the model inputs are available to the analyst. Analysts may decide to explore results under each distribution or to aggregate the distributions, assigning, for instance, a mixture. In the first case, we lose uniqueness of the sensitivity measures, and in the second case, we lose independence even if the model inputs are independent under each of the assigned distributions. Removing the unique distribution assumption impacts the mathematical properties at the basis of variance‐based sensitivity analysis and has consequences on result interpretation as well. We analyze in detail the technical aspects. From this investigation, we derive corresponding recommendations for the risk analyst. We show that an approach based on the generalized functional ANOVA expansion remains theoretically grounded in the presence of a mixture distribution. Numerically, we base the construction of the generalized function ANOVA effects on the diffeomorphic modulation under observable response preserving homotopy regression. Our application addresses the calculation of variance‐based sensitivity measures for the well‐known Nordhaus' DICE model, when its inputs are assigned a mixture distribution. A discussion of implications for the risk analyst and future research perspectives closes the work.
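A minimal numerical sketch of the point made in this abstract that sensitivity measures lose uniqueness when several plausible input distributions are available: the same model yields different first-order, variance-based indices under each candidate distribution. The toy model and the two distributions below are assumptions chosen only to make the effect visible; this is not the DICE application.

    import numpy as np

    rng = np.random.default_rng(1)
    model = lambda x1, x2: x1 + 0.5 * x2 ** 2          # toy model, not DICE

    def first_order_indices(sample_x1, sample_x2, n_bins=50):
        y = model(sample_x1, sample_x2)
        var_y = y.var()
        def S(x):
            # crude estimate of Var(E[Y|X]) by binning on equal-probability strata of X
            bins = np.quantile(x, np.linspace(0, 1, n_bins + 1))
            idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
            cond_means = np.array([y[idx == b].mean() for b in range(n_bins)])
            return cond_means.var() / var_y
        return S(sample_x1), S(sample_x2)

    n = 200_000
    # two plausible distributions for the same inputs give different indices
    print(first_order_indices(rng.normal(0, 1, n), rng.normal(0, 1, n)))
    print(first_order_indices(rng.uniform(-3, 3, n), rng.uniform(-1, 1, n)))

Under the first distribution the two inputs share the variance roughly two-to-one; under the second, the first input dominates almost completely, which is exactly the non-uniqueness the abstract discusses before turning to the mixture-distribution treatment.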
Chapter
The alarming spread of the COVID-19 pandemic has made the entire world anxious. It has not only created a global health crisis but has also resulted in a poorly performing global economy. Research and development teams around the world are working to develop vaccines, and the extensive process of vaccine development seems to be taking a lifetime. Nevertheless, these tough times have transformed how firms and their business activities operate, including working from home and operating with minimal on-site staff. Therefore, this chapter focuses on the scenario of post-COVID supply chains. In this context, we must focus on sustainable and efficient supply chains to lead business from the front. Our aim here is to discuss the challenges in supply chains in the post-COVID era, the role of digital technologies, the essence of risk assessment, and Industry 4.0 supply chains. Digital technologies such as analytics, blockchain, artificial intelligence, big data, cloud computing, and quantum computing hold the key to ensuring future supply chain security, safety, sustainability, visibility, and transparency. We also discuss VUCA conditions and their essence, as well as risk assessment frameworks.
Article
Probabilistic Risk Analysis (PRA) has been commonly used by NASA and the nuclear power industry to assess risk since the 1970s. However, PRA is not commonly used to assess risk in networked infrastructure systems such as water, sewer and power systems. Other methods which utilise network models of infrastructure, such as random and targeted attack failure analysis, N-k analysis and statistical learning theory, are instead used to analyse system performance when a disruption occurs. Such methods have the advantage of being simpler to implement than PRA. This paper explores the feasibility of a full PRA of infrastructure, that is, one that analyses all possible scenarios as well as the associated likelihoods and consequences. Such analysis is resource intensive and quickly becomes complex for even small systems. Comparing the previously mentioned, more commonly used methods to PRA provides insight into how current practices can be improved, bringing the results closer to those that would be presented from PRA. Although a full PRA of infrastructure systems may not be feasible, PRA should not be discarded. Instead, analysis of such systems should be carried out using the framework of PRA to include vital elements such as scenario likelihood analysis which are often overlooked.
Article
Full-text available
Engineering risk analysis methods, based on systems analysis and probability, are generally designed for cases in which sufficient failure statistics are unavailable. These methods can be applied not only to engineered systems that fail (e.g., new spacecraft or medical devices), but also to systems characterized by performance scenarios including malfunctions or threats. I describe some of the challenges in the use of risk analysis tools, mainly in problem formulation, when technical, human and organizational factors need to be integrated. This discussion is illustrated by four cases: ship grounding due to loss of propulsion, space shuttle loss caused by tile failure, patient risks in anesthesia, and the risks of terrorist attacks on the US. I show how the analytical challenges can be met by the choice of modeling tools and the search for relevant information, including not only statistics but also a deep understanding of how the system works and can fail, and how failures can be anticipated and prevented.
Article
Full-text available
This monograph is written for the numerate nonspecialist and hopes to serve three purposes. First, it gathers mathematical material from diverse but related fields: order statistics, records, extreme value theory, majorization, regular variation and subexponentiality. All of these are relevant for understanding fat tails, but they are not, to our knowledge, brought together in a single source for the target readership. Proofs that give insight are included, but for fussy calculations the reader is referred to the excellent sources referenced in the text. Multivariate extremes are not treated. Second, it presents a new measure of obesity. The most popular definitions in terms of regular variation and subexponentiality invoke putative properties that hold at infinity, and this complicates any empirical estimate. Each definition captures some but not all of the intuitions associated with tail heaviness. Third, it turns to fat-tailed problems in practice; they are really out there and they seriously challenge our usual ways of thinking about historical averages, outliers, trends, regression coefficients and confidence bounds, among many other things. Data on flood insurance claims, crop loss claims, hospital discharge bills, precipitation, and damages and fatalities from natural catastrophes drive this point home.
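A minimal sketch of why fat tails "seriously challenge our usual ways of thinking about historical averages": running means of a sample with a heavy Pareto tail do not settle down the way thin-tailed averages do. The sample sizes and the tail index of 1.1 are illustrative choices, not values from the monograph.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100_000
    thin = rng.exponential(1.0, n)                     # thin-tailed reference
    fat = (1 - rng.uniform(size=n)) ** (-1 / 1.1)      # Pareto with tail index 1.1 (mean barely exists)

    for name, x in [("exponential", thin), ("Pareto(1.1)", fat)]:
        running_mean = np.cumsum(x) / np.arange(1, n + 1)
        print(name, "running mean at 10k / 50k / 100k samples:",
              running_mean[9_999], running_mean[49_999], running_mean[-1])

The exponential running mean stabilizes quickly near its true value, while the Pareto running mean keeps jumping whenever a new extreme observation arrives.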
Chapter
This chapter gives an overview of PRA at three different levels. PRA has been used most intensively in the nuclear field, and the process industry is another intensive user. In other fields, simpler versions of PRA are used whenever risk quantification is needed. Risk quantification without PRA is imperfect, and in the very near future any industry facing risks will use increasingly complete versions of PRA.
Article
In this paper, we review methods for assessing and managing the risk of extreme events, where “extreme events” are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.
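A minimal sketch of one of the probabilistic methods listed in this abstract, extreme value theory in its peaks-over-threshold form: fit a Generalized Pareto distribution to exceedances over a high threshold and extrapolate the probability of an event beyond the observed range. The synthetic loss data and the threshold choice are assumptions for illustration only.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    data = rng.lognormal(mean=0.0, sigma=1.0, size=5000)   # stand-in for observed losses
    u = np.quantile(data, 0.95)                            # high threshold (assumed choice)
    exceedances = data[data > u] - u

    shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
    p_exceed_u = (data > u).mean()
    x_extreme = 3 * u                                      # well beyond most observations
    p_extreme = p_exceed_u * stats.genpareto.sf(x_extreme - u, shape, loc=0.0, scale=scale)
    print(f"threshold u = {u:.2f}, estimated P(X > {x_extreme:.2f}) = {p_extreme:.2e}")

The estimated tail probability for an event never observed in the sample is exactly the kind of output the review contrasts with single point probabilities elicited directly from experts.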
Article
The paper describes a study conducted to develop probability-based load factors and load combinations suitable for use with construction materials and technologies. The checking equation format for the proposed load criteria was selected, and the load factors and load combinations were computed using a constrained optimization procedure. Comparisons of reliabilities obtained using the proposed procedure with existing criteria are made. Guidance is provided for material specification-writing.
Article
Risk Analysis in Engineering and Economics is required reading for decision making under conditions of uncertainty. The author describes the fundamental concepts, techniques, and applications of the subject in a style tailored to meet the needs of students and practitioners of engineering, science, economics, and finance. Drawing on his extensive experience in uncertainty and risk modeling and analysis, the author covers everything from basic theory and key computational algorithms to data needs, sources, and collection. He emphasizes practical use of the methods presented and carefully examines the limitations, advantages, and disadvantages of each to help readers translate the discussed techniques into real-world solutions. This Second Edition: • Introduces the topic of risk finance • Incorporates homeland security applications throughout • Offers additional material on predictive risk management • Includes a wealth of new and updated end-of-chapter problems • Delivers a complementary mix of theoretical background and risk methods • Brings together engineering and economics on balanced terms to enable appropriate decision making • Presents performance segregation and aggregation within a risk framework • Contains contemporary case studies, such as protecting hurricane-prone regions and critical infrastructure • Provides 320+ tables and figures, over 110 diverse examples, numerous end-of-book references, and a bibliography. Unlike the classical books on reliability and risk management, Risk Analysis in Engineering and Economics, Second Edition relates underlying concepts to everyday applications, ensuring solid understanding and use of the methods of risk analysis.
Article
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000-yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of stochastic uncertainty is discussed, including drilling intrusion time, drilling location, penetration of excavated/nonexcavated areas of the repository, penetration of pressurized brine beneath the repository, borehole plugging patterns, activity level of waste, and occurrence of potash mining. Additional topics discussed include sampling procedures, generation of individual 10,000-yr futures for the WIPP, construction of complementary cumulative distribution functions (CCDFs), mechanistic calculations carried out to support CCDF construction, the Kaplan/Garrick ordered triple representation for risk, and determination of scenarios and scenario probabilities.
Article
The 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) maintains a separation between stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty, with stochastic uncertainty arising from the possible disruptions that could occur at the WIPP over the 10,000 yr regulatory period specified by the US Environmental Protection Agency (40 CFR 191, 40 CFR 194) and subjective uncertainty arising from an inability to uniquely characterize many of the inputs required in the 1996 WIPP PA. The characterization of subjective uncertainty is discussed, including assignment of distributions, uncertain variables selected for inclusion in analysis, correlation control, sample size, statistical confidence on mean complementary cumulative distribution functions, generation of Latin hypercube samples, sensitivity analysis techniques, and scenarios involving stochastic and subjective uncertainty.
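A minimal sketch of one technique named in this abstract, the generation of a Latin hypercube sample for epistemically uncertain inputs: each variable is sampled exactly once from each of n equal-probability strata, and the columns are paired at random. The two parameter names and ranges below are illustrative assumptions, not WIPP values.

    import numpy as np

    def latin_hypercube(n_samples, n_vars, rng):
        # one equal-probability stratum per sample in each column, randomly ordered
        u = np.empty((n_samples, n_vars))
        for j in range(n_vars):
            u[:, j] = (rng.permutation(n_samples) + rng.uniform(size=n_samples)) / n_samples
        return u                                        # uniform(0,1) design

    rng = np.random.default_rng(4)
    design = latin_hypercube(100, 2, rng)
    permeability = 10 ** (-14 + 2 * design[:, 0])       # illustrative log-uniform range, m^2
    porosity = 0.05 + 0.25 * design[:, 1]               # illustrative uniform range
    print(permeability[:3], porosity[:3])

Compared with simple random sampling, the stratification spreads a small sample evenly over each input range, which is why the technique is attractive when each sample element requires expensive mechanistic calculations.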
Conference Paper
This paper presents an analysis of accident precursors or ‘near misses’ (e.g. the space shuttle flights with damaged O-rings prior to the Challenger disaster). The interpretation of such events is often problematic, since the ambiguous nature of the evidence in such cases makes them subject to widely divergent interpretations. Simple Bayesian analysis provides a resolution of this problem, showing that an accident precursor almost always justifies increased rather than decreased estimates of overall accident frequencies.
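A minimal sketch, with made-up conjugate priors, of the Bayesian point summarized above: observing accident precursors (near misses) typically raises the estimated accident frequency, because they reveal that the initiating conditions occur more often than previously believed, even though no accident has yet occurred.

    # lam ~ Gamma(a, b): annual rate of precursor conditions (prior assumed)
    # p   ~ Beta(alpha, beta): probability a precursor escalates into an accident (prior assumed)
    a, b = 1.0, 10.0            # prior mean precursor rate 0.1 / yr
    alpha, beta = 1.0, 9.0      # prior mean escalation probability 0.1

    prior_accident_freq = (a / b) * (alpha / (alpha + beta))

    # one year of operation with k precursors observed, none of which escalated
    k, t = 2, 1.0
    post_lam_mean = (a + k) / (b + t)                   # Gamma-Poisson update of the precursor rate
    post_p_mean = alpha / (alpha + beta + k)            # Beta update of the escalation probability
    post_accident_freq = post_lam_mean * post_p_mean

    print(f"prior accident frequency  = {prior_accident_freq:.4f} / yr")
    print(f"posterior after 2 near misses = {post_accident_freq:.4f} / yr")

With these priors the posterior accident frequency more than doubles: the upward revision of the precursor rate outweighs the slight downward revision of the escalation probability, which is the qualitative conclusion of the paper.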
Article
Complete probabilistic seismic hazard analyses incorporate epistemic uncertainties in assumptions, models, and parameters, and lead to a distribution of annual frequency of exceedance versus ground motion amplitude (the “seismic hazard”). For decision making, if a single representation of the seismic hazard is required, it is always preferable to use the mean of this distribution, rather than some other representation, such as a particular fractile. Use of the mean is consistent with modern interpretations of probability and with precedents of safety goals and cost-benefit analysis.
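A minimal numerical sketch, with illustrative branch weights and hazard values, of the distinction drawn above: when epistemic uncertainty is represented by weighted alternative hazard estimates, the mean annual exceedance frequency generally differs from any single fractile such as the median.

    import numpy as np

    weights = np.array([0.2, 0.5, 0.3])                  # epistemic branch weights (assumed)
    exceed_freq = np.array([1e-4, 4e-4, 2e-3])           # annual exceedance frequency per branch (assumed)

    mean_hazard = weights @ exceed_freq
    order = np.argsort(exceed_freq)
    cum = np.cumsum(weights[order])
    median_hazard = exceed_freq[order][np.searchsorted(cum, 0.5)]
    print(f"mean = {mean_hazard:.2e} /yr, median (50th fractile) = {median_hazard:.2e} /yr")

Here the mean is roughly twice the median because the pessimistic branch pulls it upward; the paper's argument is that the mean, not a chosen fractile, is the representation consistent with decision making.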
Article
This paper continues a study of event ambiguity as a primitive concept. Axioms are described for a comparative ambiguity relation on an arbitrary event set that are necessary and sufficient for a representation of the relation by a functional that is nonnegative, vanishes at the empty event, and satisfies complementary equality and submodularity. Uniqueness characteristics of representing functionals are discussed. The theory is extended to multifactor events, where marginal ambiguity and additive representations arise.
Article
This paper introduces a method for the evaluation of the seismic risk at the site of an engineering project. The results are in terms of a ground-motion parameter (such as peak acceleration) versus average return period. The method incorporates the influence of all potential sources of earthquakes and the average activity rates assigned to them. Arbitrary geographical relationships between the site and potential point, line, or areal sources can be modeled with computational ease. In the range of interest, the derived distributions of maximum annual ground motions are in the form of Type I or Type II extreme value distributions, if the more commonly assumed magnitude distribution and attenuation laws are used.
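A minimal Monte Carlo sketch of the hazard-curve logic this abstract describes: combine a source activity rate, a magnitude distribution, and an attenuation relation into annual exceedance frequencies and return periods for a ground-motion parameter. All numbers (activity rate, b-value, distance range, attenuation form) are illustrative assumptions, not the paper's.

    import numpy as np

    rng = np.random.default_rng(5)
    nu = 0.2                                     # events/yr on the source (assumed)
    m_min, m_max, b = 5.0, 7.5, 1.0              # truncated Gutenberg-Richter parameters (assumed)

    n = 200_000
    u = rng.uniform(size=n)
    beta = b * np.log(10)
    # inverse-CDF sample of the truncated exponential magnitude distribution
    m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
    r = rng.uniform(10, 60, n)                   # source-to-site distance, km (assumed)
    # toy attenuation relation: median ln(PGA in g) plus lognormal scatter
    ln_pga = -3.5 + 0.9 * m - 1.2 * np.log(r) + rng.normal(0, 0.6, n)
    pga = np.exp(ln_pga)

    for a in [0.05, 0.1, 0.2, 0.4]:
        lam = nu * np.mean(pga > a)              # annual frequency of exceedance
        print(f"PGA > {a:.2f} g: lambda = {lam:.2e} /yr, return period = {1/lam:,.0f} yr")

Plotting the ground-motion level against the resulting return period gives the site hazard curve that the method is designed to produce.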
Article
A quantitative definition of risk is suggested in terms of the idea of a “set of triplets”. The definition is extended to include uncertainty and completeness, and the use of Bayes' theorem is described in this connection. The definition is used to discuss the notions of “relative risk”, “relativity of risk”, and “acceptability of risk”.
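In symbols, a compact restatement of the "set of triplets" idea just described (with s_i, p_i, x_i denoting scenario, likelihood, and consequence):

    R = \{ \langle s_i, p_i, x_i \rangle \}, \qquad i = 1, 2, \dots, N

where s_i answers "what can go wrong?", p_i is its likelihood, and x_i its consequence; extending the definition to include uncertainty replaces each p_i by a probability distribution over frequencies rather than a single number.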
Article
Considerable attention has recently been given to general equilibrium models of the pricing of capital assets. Of these, perhaps the best known is the mean-variance formulation originally developed by Sharpe (1964) and Treynor (1961), and extended and clarified by Lintner (1965a; 1965b), Mossin (1966), Fama (1968a; 1968b), and Long (1972). In addition, Treynor (1965), Sharpe (1966), and Jensen (1968; 1969) have developed portfolio evaluation models which are either based on this asset pricing model or bear a close relation to it. In the development of the asset pricing model it is assumed that (1) all investors are single-period, risk-averse maximizers of the utility of terminal wealth and can choose among portfolios solely on the basis of mean and variance, (2) there are no taxes or transactions costs, (3) all investors have homogeneous views regarding the parameters of the joint probability distribution of all security returns, and (4) all investors can borrow and lend at a given riskless rate of interest. The main result of the model is a statement of the relation between the expected risk premiums on individual assets and their "systematic risk." Our main purpose is to present some additional tests of this asset pricing model which avoid some of the problems of earlier studies and which, we believe, provide additional insights into the nature of the structure of security returns. The evidence presented in Section II indicates the expected excess return on an asset is not strictly proportional to its β, and we believe that this evidence, coupled with that given in Section IV, is sufficiently strong to warrant rejection of the traditional form of the model given by (1). We then show in Section III how the cross-sectional tests are subject to measurement error bias, provide a solution to this problem through grouping procedures, and show how cross-sectional methods are relevant to testing the expanded two-factor form of the model. We show in Section IV that the mean of the beta factor has had a positive trend over the period 1931-65 and was on the order of 1.0 to 1.3% per month in the two sample intervals we examined in the period 1948-65. This seems to have been significantly different from the average risk-free rate and indeed is roughly the same size as the average market return of 1.3 and 1.2% per month over the two sample intervals in this period. This evidence seems sufficiently strong to warrant rejection of the traditional form of the model given by (1). In addition, the standard deviation of the beta factor over these two sample intervals was 2.0 and 2.2% per month, as compared with the standard deviation of the market factor of 3.6 and 3.8% per month. Thus the beta factor seems to be an important determinant of security returns.
Article
Monotone measures and Choquet capacities are introduced as a framework for formalizing imprecise probabilities. Arguments for using imprecise probabilities are presented and five representations of imprecise probabilities are introduced: lower probability functions, upper probability functions, closed convex sets of probability distributions, Möbius representations, and interactive representations. It is also shown that the classical notion of expected value can be generalized via the Choquet integral. The emphasis of this chapter is on the various unifying features of imprecise probabilities.
Article
Centre of Location. That abscissa of a frequency curve for which the sampling errors of optimum location are uncorrelated with those of optimum scaling. (9.)
Book
"This is the classic work upon which modern-day game theory is based. What began more than sixty years ago as a modest proposal that a mathematician and an economist write a short paper together blossomed, in 1944, when Princeton University Press published Theory of Games and Economic Behavior. In it, John von Neumann and Oskar Morgenstern conceived a groundbreaking mathematical theory of economic and social organization, based on a theory of games of strategy. Not only would this revolutionize economics, but the entirely new field of scientific inquiry it yielded--game theory--has since been widely used to analyze a host of real-world phenomena from arms races to optimal policy choices of presidential candidates, from vaccination policy to major league baseball salary negotiations. And it is today established throughout both the social sciences and a wide range of other sciences. This sixtieth anniversary edition includes not only the original text but also an introduction by Harold Kuhn, an afterword by Ariel Rubinstein, and reviews and articles on the book that appeared at the time of its original publication in the New York Times, tthe American Economic Review, and a variety of other publications. Together, these writings provide readers a matchless opportunity to more fully appreciate a work whose influence will yet resound for generations to come.
Article
The first edition of 1928 is here extended and brought up to date. This book is for non-mathematicians, avoiding the use of formulas and the discussion of problems not readily amenable to non-mathematical treatment. The author aims to offer only a systematic description of certain classes of natural phenomena in the manner of the exact sciences, repudiating all "empty phrases" of metaphysics, and avoiding the error of exaggerated rationalism by restricting the application of "probability." His notion of the "collective" (a sequence satisfying certain conditions of randomness) is fundamental in this theory. The six lectures are: (I) the definition of probability, (II) the elements of the theory of probability, (III) critical discussion of the foundations of the new theory of probability, (IV) the laws of large numbers, (V) applications in statistics and the theory of errors, (VI) problems of statistical physics.
Article
I. Are there uncertainties that are not risks? — II. Uncertainties that are not risks. — III. Why are some uncertainties not risks?
Article
We present in this article the findings from a study on insolvency in the property–casualty insurance industry that was commissioned by the Risk Foundation. The Risk Foundation contacted us for this work to draw from our experience in risk analysis based on systems analysis and probability. Therefore, we provide a different perspective on failure in the insurance industry by opening the “black box” to assess the contribution of different factors to the overall risk. Besides the development of a quantitative model for insolvency risk, our study for the Risk Foundation included insights from (1) unstructured interviews with 15 insurance industry experts and with six insurance regulators in different states, and (2) a statistical analysis of insolvency data (A.M. Best) covering the 1970 through 2005 period. Our focus here is centered on the practical insights that came out of the study, rather than on the technical details that led us to those insights.
Article
A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant.
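A minimal sketch, with invented parameters, of the competing-phenomena logic this abstract describes: a fire scenario contributes to plant risk only when propagation to the critical target is faster than detection and suppression, so the scenario frequency is the ignition frequency times the probability of losing that race.

    import numpy as np

    rng = np.random.default_rng(6)
    lam_ignition = 2e-2                                       # fires/yr in the room (assumed)
    n = 500_000
    t_propagation = rng.lognormal(np.log(20), 0.5, n)         # minutes to damage critical cables (assumed)
    t_suppression = rng.exponential(15, n)                    # minutes to detect and suppress (assumed)

    p_damage_given_fire = np.mean(t_suppression > t_propagation)
    freq_damage_scenario = lam_ignition * p_damage_given_fire
    print(f"P(damage | fire) = {p_damage_given_fire:.2f}, "
          f"scenario frequency = {freq_damage_scenario:.1e} /yr")

The resulting scenario frequency is then passed to the plant system model to evaluate its impact on safety, which is the second part of the methodology.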
Article
During the period from 1977 to 1984, Pickard, Lowe and Garrick, Inc., had the lead in preparing several full scope probabilistic risk assessments for electric utilities. Five of those studies are discussed from the point of view of advancements and lessons learned. The objective and trend of these studies is toward utilization of the risk models by the plant owners as risk management tools. Advancements that have been made are in presentation and documentation of the PRAs, generation of more understandable plant level information, and improvements in methodology to facilitate technology transfer. Specific areas of advancement are in the treatment of such issues as dependent failures, human interaction, and the uncertainty in the source term. Lessons learned cover a wide spectrum and include the importance of plant specific models for meaningful risk management, the role of external events in risk, the sensitivity of contributors to choice of risk index, and the very important finding that the public risk is extremely small. The future direction of PRA is to establish less dependence on experts for in-plant application. Computerizing the PRAs such that they can be accessed on line and interactively is the key.
Article
One of the most perplexing problems in risk analysis is why some relatively minor risks or risk events, as assessed by technical experts, often elicit strong public concerns and result in substantial impacts upon society and economy. This article sets forth a conceptual framework that seeks to link systematically the technical assessment of risk with psychological, sociological, and cultural perspectives of risk perception and risk-related behavior. The main thesis is that hazards interact with psychological, social, institutional, and cultural processes in ways that may amplify or attenuate public responses to the risk or risk event. A structural description of the social amplification of risk is now possible. Amplification occurs at two stages: in the transfer of information about the risk, and in the response mechanisms of society. Signals about risk are processed by individual and social amplification stations, including the scientist who communicates the risk assessment, the news media, cultural groups, interpersonal networks, and others. Key steps of amplifications can be identified at each stage. The amplified risk leads to behavioral responses, which, in turn, result in secondary impacts. Models are presented that portray the elements and linkages in the proposed conceptual framework.
Article
The risk of death or brain damage to anesthesia patients is relatively low, particularly for healthy patients in modern hospitals. When an accident does occur, its cause is usually an error made by the anesthesiologist, either in triggering the accident sequence, or failing to take timely corrective measures. This paper presents a pilot study which explores the feasibility of extending probabilistic risk analysis (PRA) of anesthesia accidents to assess the effects of human and management components on the patient risk. We develop first a classic PRA model for the patient risk per operation. We then link the probabilities of the different accident types to their root causes using a probabilistic analysis of the performance shaping factors. These factors are described here as the “state of the anesthesiologist” characterized both in terms of alertness and competence. We then analyze the effects of different management factors that affect the state of the anesthesiologist and we compute the risk reduction benefits of several risk management policies. Our data sources include the published version of the Australian Incident Monitoring Study as well as expert opinions. We conclude that patient risk could be reduced substantially by closer supervision of residents, the use of anesthesia simulators both in training and for periodic recertification, and regular medical examinations for all anesthesiologists.
Article
In this paper, we present a pilot study in which we use probabilistic risk analysis (PRA) to assess patient risk in anesthesia and its human factor component. We then identify and evaluate the benefits of several risk reduction policies. We focus on healthy patients, in modern hospitals, and on cases where the anesthetist is a trained medical doctor. When an accident occurs for such patients, it is often because an error was made by the anesthesiologist, either triggering the event that initiated the accident sequence, or failing to take timely corrective measures. We present first a dynamic PRA model of anesthesia accidents. Our data include published results of the Australian Incident Monitoring Study as well as expert opinions. We link the probabilities of the different types of accidents to the state of the anesthesiologist characterized both in terms of alertness and competence. We consider different management factors that affect the state of the anesthesiologist, we identify several risk reduction policies, and we compute the corresponding risk reduction benefits based on the PRA model. We conclude that periodic recertification of all anesthesiologists, the use of anesthesia simulators in training, and closer supervision of residents could reduce substantially the patient risk.
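A minimal sketch, with invented numbers, of the structure described in these two anesthesia studies: the per-operation accident probability is conditioned on the state of the anesthesiologist, and a risk-management policy is evaluated by how it shifts the distribution of that state. The states, probabilities, and the policy effect below are assumptions for illustration, not the studies' data.

    import numpy as np

    states = ["alert and competent", "fatigued", "inexperienced"]
    p_accident_given_state = np.array([1e-5, 5e-5, 8e-5])     # per operation (assumed)

    p_state_baseline = np.array([0.85, 0.10, 0.05])
    p_state_policy = np.array([0.92, 0.05, 0.03])             # e.g. closer supervision of residents (assumed)

    risk_baseline = p_state_baseline @ p_accident_given_state
    risk_policy = p_state_policy @ p_accident_given_state
    print(f"baseline patient risk = {risk_baseline:.2e} per operation")
    print(f"with policy           = {risk_policy:.2e} per operation "
          f"({100 * (1 - risk_policy / risk_baseline):.0f}% reduction)")

The same structure supports comparing several candidate policies (recertification, simulator training, supervision) by the risk reduction each buys.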
Article
This paper examines preferences among uncertain prospects when the decision maker is uneasy about his assignment of subjective probabilities. It proposes a two-stage lottery framework for the analysis of such prospects, where the first stage represents an assessment of the vagueness (ambiguity) in defining the problem's randomness and the second stage represents an assessment of the problem for each hypothesized randomness condition. Standard axioms of rationality are prescribed for each stage, including weak ordering, continuity, and strong independence. The Reduction of Compound Lotteries' axiom is weakened, however, so that the two lottery stages have consistent, but not collapsible, preference structures. The paper derives a representation theorem from the primitive preference axioms, and the theorem asserts that preference-consistent decisions are made as if the decision maker is maximizing a modified expected utility functional. This representation and its implications are compared to alternative decision models. Criteria for assigning the relative empirical power of the alternative models are suggested.
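A minimal numerical illustration (not the paper's axiomatization) of why keeping the two lottery stages separate matters when reduction of compound lotteries is relaxed: the inner (risk) stage is evaluated with a utility u, the outer (ambiguity) stage applies its own, more concave transformation to the inner certainty equivalents, and the ambiguous prospect ends up valued below its collapsed single-stage counterpart. The functions u and phi and all numbers are assumptions chosen for the example.

    import numpy as np

    u = np.sqrt                                     # risk attitude (assumed)
    u_inv = lambda x: x ** 2
    phi = np.log1p                                  # ambiguity attitude, more concave than u (assumed)
    phi_inv = lambda x: np.expm1(x)

    # two hypotheses about the "true" chance of winning 100 (otherwise 0)
    p_hypotheses = np.array([0.2, 0.8])
    hypothesis_weights = np.array([0.5, 0.5])

    ce_inner = u_inv(p_hypotheses * u(100.0))       # certainty equivalent under each hypothesis
    value_two_stage = phi_inv(hypothesis_weights @ phi(ce_inner))

    p_collapsed = hypothesis_weights @ p_hypotheses  # = 0.5 if the stages were simply reduced
    value_reduced = u_inv(p_collapsed * u(100.0))
    print(f"two-stage value = {value_two_stage:.1f}, reduced-lottery value = {value_reduced:.1f}")

The two-stage value is markedly lower than the reduced-lottery value, which is the kind of ambiguity-averse behavior the modified expected utility functional is meant to accommodate.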
Article
The thermal protection system of the space shuttle is one of its most critical subsystems because it protects the orbiter from heavy heat loads at reentry into the atmosphere. To optimize NASA's allocation of risk management resources, a probabilistic risk analysis model is developed for the black tiles, and a risk-criticality index is computed for each tile based on its contribution to the overall probability of loss of vehicle and crew (LOV/C). This assessment is based on the susceptibility of the tiles (i.e. their probabilities of debonding), and on the vulnerability of the orbiter to specific tile losses given the criticality of the subsystems under the aluminum skin in various locations. The two main initiating events are linked to the debonding of a tile, caused either by debris hits or by a weak bond because of poor tile installation. The PRA model relies on a partition of the orbiter's surface according to four parameters: the probability of debris hits, the probability of secondary tile loss once a first tile has debonded, the probability of burnthrough given a failure patch of specified size, and the probability of LOV given a hole in the orbiter's aluminum skin. The results show that the contribution of the tiles to the overall probability of LOV is about 10%. They also include a map of the orbiter's surface showing the relative risk-criticality of tiles at various locations. It was found that 85% of the risk can be attributed to 15% of the tiles, thus allowing the management to allocate more effort and resources to the maintenance of the most risk-critical tiles.
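A minimal sketch, with invented numbers, of the zone-by-zone logic in the tile PRA described above: for each region of the orbiter's surface, the contribution to loss of vehicle and crew combines the four parameters named in the abstract (debris-hit or debonding likelihood, secondary tile loss, burnthrough, and the criticality of the subsystems under the skin), and a relative risk-criticality index follows by normalization. The zone names and probabilities are illustrative, not the study's values.

    import numpy as np

    zones = ["under fuel tank", "wing area", "aft body flap"]
    p_debond = np.array([4e-3, 2e-3, 1e-3])        # initiating tile debond per flight (assumed)
    p_secondary = np.array([0.3, 0.2, 0.1])        # adjacent tiles lost given a first loss (assumed)
    p_burnthrough = np.array([0.2, 0.15, 0.05])    # hole in skin given a patch of missing tiles (assumed)
    p_lov_given_hole = np.array([0.8, 0.5, 0.1])   # criticality of underlying subsystems (assumed)

    p_lov_zone = p_debond * p_secondary * p_burnthrough * p_lov_given_hole
    p_lov_tiles = p_lov_zone.sum()
    criticality_index = p_lov_zone / p_lov_tiles   # relative risk-criticality per zone

    for z, c in zip(zones, criticality_index):
        print(f"{z:16s} share of tile-related LOV risk: {100 * c:.0f}%")
    print(f"total tile-related P(LOV) per flight = {p_lov_tiles:.1e}")

Even with made-up inputs, the concentration of risk in a small fraction of the surface emerges naturally, which is the feature that lets maintenance effort be allocated to the most risk-critical tiles.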