Risk Analysis

Published by Wiley
Online ISSN: 1539-6924
Article
This article proposes a conceptual framework for ranking the relative gravity of diverse risks. This framework identifies the moral considerations that should inform the evaluation and comparison of diverse risks. A common definition of risk includes two dimensions: the probability of occurrence and the associated consequences of a set of hazardous scenarios. This article first expands this definition to include a third dimension: the source of a risk. The source of a risk refers to the agents involved in the creation or maintenance of a risk and captures a central moral concern about risks. Then, a scale of risk is proposed to categorize risks along a multidimensional ranking, based on a comparative evaluation of the consequences, probability, and source of a given risk. A risk is ranked higher on the scale the larger the consequences, the greater the probability, and the more morally culpable the source. The information from the proposed comparative evaluation of risks can inform the selection of priorities for risk mitigation.
 
Article
The last few decades have seen increasingly widespread use of risk assessment and management techniques as aids in making complex decisions. However, despite the progress that has been made in risk science, there remain numerous examples of risk-based decisions and conclusions that have caused great controversy. In particular, there is a great deal of debate surrounding risk assessment: the role of values, ethics, and other extra-scientific factors; the efficacy of quantitative versus qualitative analysis; and the role of uncertainty and incomplete information. Many of the epistemological and methodological issues confronting risk assessment have been explored in general systems theory, where techniques exist to manage them. However, the use of systems theory and systems analysis tools is still not widespread in risk management. This article builds on the Alachlor risk assessment case study of Brunk, Haworth, and Lee to present a systems-based view of the risk assessment process. The details of the case study are reviewed, and the authors' original conclusions regarding the effects of extra-scientific factors on risk assessment are discussed. Concepts from systems theory are introduced to provide a mechanism with which to illustrate these extra-scientific effects. The role of a systems study within a risk assessment is explained, resulting in an improved view of the problem formulation process. The consequences for the definition of risk and its role in decision making are then explored.
 
Article
There has been considerable scientific effort to understand the potential link between exposures to power-frequency electric and magnetic fields (EMF) and the occurrence of cancer and other diseases. The combination of widespread exposures, established biological effects from acute, high-level exposures, and the possibility of leukemia in children from low-level, chronic exposures has made it both necessary and difficult to develop consistent public health policies. In this article we review the basis of both numeric standards and precautionary-based approaches. While we believe that policies regarding EMF should indeed be precautionary, this does not require or imply adoption of numeric exposure standards. We argue that cutpoints from epidemiologic studies, which are arbitrarily chosen, should not be used as the basis for setting exposure limits, given the many attendant uncertainties. Establishment of arbitrary numeric exposure limits undermines the value of both the science-based numeric EMF exposure standards for acute exposures and precautionary approaches. The World Health Organization's draft Precautionary Framework provides guidance for establishing appropriate public health policies for power-frequency EMF.
 
Article
The Texas Commission on Environmental Quality (TCEQ) has developed an inhalation unit risk factor (URF) for 1,3-butadiene based on leukemia mortality in an updated epidemiological study of styrene-butadiene rubber production workers conducted by researchers at the University of Alabama at Birmingham. Exposure estimates were updated, and an exposure estimate validation study as well as dose-response modeling were conducted by these researchers. This information was not available to the U.S. Environmental Protection Agency when it prepared its health assessment of 1,3-butadiene in 2002. An extensive analysis conducted by TCEQ discusses dose-response modeling, estimating risk for the general population from occupational workers, estimating risk for potentially sensitive subpopulations, the effect of occupational exposure estimation error, and the use of mortality rates to predict incidence. The URF is 5.0 × 10⁻⁷ per µg/m³ (1.1 × 10⁻⁶ per ppb) and is based on a Cox regression dose-response model using restricted continuous data with age as a covariate, and a linear low-dose extrapolation default approach using the 95% lower confidence limit as the point of departure. Age-dependent adjustment factors were applied to account for possible increased susceptibility from early-life exposure. The air concentration at 1 in 100,000 excess leukemia mortality, the no-significant-risk level, is 20 µg/m³ (9.1 ppb), which is slightly lower than the TCEQ chronic reference value of 33 µg/m³ (15 ppb) protective against ovarian atrophy. These values will be used to evaluate ambient air monitoring data so that the general public is protected against adverse health effects from chronic exposure to 1,3-butadiene.
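The unit conversions behind these figures can be checked with a few lines of arithmetic. The sketch below assumes the standard molar volume of 24.45 L/mol at 25 °C and the molecular weight of 1,3-butadiene (54.09 g/mol); neither value is stated in the abstract.

```python
# Sanity check of the URF unit conversion and the 1-in-100,000 air level.
# MW and molar volume are standard reference values (assumed, not from the text).

MW_BUTADIENE = 54.09   # g/mol, 1,3-butadiene
MOLAR_VOLUME = 24.45   # L/mol at 25 degrees C and 1 atm

urf_per_ugm3 = 5.0e-7  # URF per microgram/m^3, from the abstract

# For a gas, 1 ppb corresponds to MW / 24.45 micrograms per cubic meter.
ugm3_per_ppb = MW_BUTADIENE / MOLAR_VOLUME
urf_per_ppb = urf_per_ugm3 * ugm3_per_ppb   # ~1.1e-6 per ppb

# Linear low-dose model: concentration at 1-in-100,000 excess risk.
target_risk = 1e-5
nsrl_ugm3 = target_risk / urf_per_ugm3      # 20 micrograms/m^3
nsrl_ppb = nsrl_ugm3 / ugm3_per_ppb         # ~9 ppb

print(f"URF = {urf_per_ppb:.2e} per ppb; NSRL = {nsrl_ugm3:.0f} ug/m^3 ({nsrl_ppb:.1f} ppb)")
```

Both derived numbers land close to the abstract's reported values, with the small difference in the ppb figure attributable to rounding conventions in the conversion.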
 
Article
Many systems analysts will be surprised to encounter a program which is billed as a fault-tree development program "perform[ing] logical mathematical operations," but which does not perform Boolean reduction. The rather careful wording quoted in the introduction to this review can, in retrospect, be taken to refer to the fact that gates are quantified using formulas from the calculus of probabilities, and not to claim that Boolean reduction is performed. Since this program does not perform Boolean reduction, its use is limited to essentially graphical applications of the type illustrated in Fig. 1. For this limited application, the program has some features which make it attractive; it is easy to develop and print a passable drawing of a fault tree, and it is easy to do "what-if" analyses (looking at the effects of changing connections or statistics). However, for fault-tree analyses of even moderate complexity, a Boolean processor is necessary (a large fault tree for a real problem in which no events are repeated is arguably a pathological case). Many such algorithms exist on DOS machines, and most of them run within (and are limited to) the usual 640k memory limitation. To be fair, it has to be noted that some commercial algorithms of this type cost far, far more than FaultrEASE (their costs are measured in thousands of dollars rather than hundreds).
 
Article
September 11 created a natural experiment that enables us to track the psychological effects of a large-scale terror event over time. The archival data came from 8,070 participants of 10 ABC and CBS News polls collected from September 2001 until September 2006. Six questions investigated emotional, behavioral, and cognitive responses to the events of September 11 over a five-year period. We found that heightened responses after September 11 dissipated and reached a plateau at various points in time over a five-year period. We also found that emotional, cognitive, and behavioral reactions were moderated by age, sex, political affiliation, and proximity to the attack. Both emotional and behavioral responses returned to a normal state after one year, whereas cognitively-based perceptions of risk were still diminishing as late as September 2006. These results provide insight into how individuals will perceive and respond to future similar attacks.
 
Article
The Rice-MD Anderson group uses a two-stage clonal expansion (TSCE) model of lung cancer mortality calibrated to a combination of MD Anderson case-control data on smoking histories and lung cancer mortality/incidence rate data collected from prospective cohorts in order to predict risk of lung cancer. This model is used to simulate lung cancer mortality in the U.S. population under the three scenarios of CISNET lung group's smoking base case project in order to estimate the effect of tobacco control policy on lung cancer mortality rates. Simulation results show that tobacco control policies have achieved 35% of the reduction in lung cancer mortality that would have resulted from cessation of all smoking in 1965.
 
Article
Public and political opposition have made finding locations for new nuclear power plants, waste management, and nuclear research and development facilities a challenge for the U.S. government and the nuclear industry. U.S. government-owned properties that already have nuclear-related activities and commercial nuclear power generating stations are logical locations. Several studies and utility applications to the Nuclear Regulatory Commission suggest that concentrating locations at major plants (CLAMP) has become an implicit siting policy. We surveyed 2,101 people who lived within 50 miles of 11 existing major nuclear sites and 600 who lived elsewhere in the United States. Thirty-four percent favored CLAMP for new nuclear power plants, 52% for waste management facilities, and 50% for new nuclear laboratories. College educated, relatively affluent male whites were the strongest CLAMP supporters. They disproportionately trusted those responsible for the facilities and were not worried about existing nuclear facilities or other local environmental issues. Notably, they were concerned about continuing coal use. Not surprisingly, CLAMP proponents tended to be familiar with their existing local nuclear site. In short, likely CLAMP sites have a large and politically powerful core group to support a CLAMP policy. The challenge to proponents of nuclear technologies will be to sustain this support and expand the base among those who clearly are less connected and receptive to new nearby sites.
 
Article
Data from a human feeding trial with healthy men were used to develop a dose-response model for 13 strains of Salmonella and to determine the effects of strain variation on the shape of the dose-response curve. Dose-response data for individual strains were fit to a three-phase linear model to determine minimum, median, and maximum illness doses, which were used to define Pert distributions in a computer simulation model. Pert distributions for illness dose of individual strains were combined in an Excel spreadsheet using a discrete distribution to model strain prevalence. In addition, a discrete distribution was used to model dose groups and thus create a model that simulated human feeding trials. During simulation of the model with @Risk, an illness dose and a dose consumed were randomly assigned to each consumption event in the simulated feeding trial and if the illness dose was greater than the dose consumed then the model predicted no illness, otherwise the model predicted that an illness would occur. To verify the dose-response model predictions, the original feeding trial was simulated. The dose-response model predicted a median of 69 (range of 43-101) illnesses compared to 74 in the original trial. Thus, its predictions were in agreement with the data used to develop it. However, predictions of the model are only valid for eggnog, healthy men, and the strains and doses of Salmonella used to develop it. When multiple strains of Salmonella were simulated together, the predicted dose-response curves were irregular in shape. Thus, the sigmoid shape of dose-response curves in feeding trials with one strain of Salmonella may not accurately reflect dose response in naturally contaminated food where multiple strains may be present.
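The comparison logic described above (illness occurs when the consumed dose reaches a randomly drawn illness dose) can be sketched roughly as follows. This is a minimal stand-in, not the published model: the strain parameters, prevalence, and dose groups below are hypothetical, and the Beta-PERT sampler is a common parameterization rather than the exact @Risk implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def pert(rng, low, mode, high, size, lam=4.0):
    """Sample a Beta-PERT distribution from its min, most-likely, and max."""
    a = 1.0 + lam * (mode - low) / (high - low)
    b = 1.0 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical illness-dose parameters (log10 CFU) for two strains; the
# published model used 13 strains with parameters fit to feeding-trial data.
strains = {"strain_A": (3.0, 5.0, 7.0), "strain_B": (4.0, 6.0, 8.0)}
prevalence = [0.5, 0.5]   # discrete distribution over strains (hypothetical)

n = 10_000
names = list(strains)
picks = rng.choice(len(names), size=n, p=prevalence)

# Draw an illness dose for each simulated consumption event according to
# its strain, and a consumed dose from hypothetical dose groups.
illness_dose = np.array([pert(rng, *strains[names[k]], size=1)[0] for k in picks])
consumed_dose = rng.uniform(2.0, 9.0, size=n)

# Illness is predicted only when the consumed dose reaches the illness dose.
illnesses = int((consumed_dose >= illness_dose).sum())
print(f"predicted illnesses: {illnesses} of {n} events")
```

Mixing strains through the discrete prevalence distribution is what produces the irregular (non-sigmoid) aggregate dose-response curves the abstract describes.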
 
Article
Sophisticated modeling techniques can be powerful tools to help us understand the effects of cancer control interventions on population trends in cancer incidence and mortality. Readers of journal articles are, however, rarely supplied with modeling details. Six modeling groups collaborated as part of the National Cancer Institute's Cancer Intervention and Surveillance Modeling Network (CISNET) to investigate the contribution of U.S. tobacco-control efforts toward reducing lung cancer deaths over the period 1975-2000. The six models included in this monograph were developed independently and use distinct, complementary approaches toward modeling the natural history of lung cancer. The models used the same data for inputs, and agreed on the design of the analysis and the outcome measures. This article highlights aspects of the models that are most relevant to similarities of or differences between the results. Structured comparisons can increase the transparency of these complex models.
 
Article
Staphylococcus aureus is a gram-positive, enterotoxin-producing coccus. It is a hardy organism and known to survive over a wide range of water activities, pH values, and temperatures. The objective of this study was to model the survival or gradual inactivation of S. aureus ATCC 13565 in intermediate moisture foods (IMFs). Various initial concentrations (approximately 10(1), 10(2), 10(3), and 10(4) CFU/g) were used to inoculate three different IMFs (beefsteak, bread, and chicken pockets). Viable counts were determined up to 60 days using tryptic soy agar. Inoculum size did not influence the survival or gradual inactivation of S. aureus in these foods. The rate of change (increase or decrease) in log CFU/day was calculated for every consecutive pair of data points and by linear regression for each inactivation curve. Both consecutive-pair and linear regression rates of change were fit to logistic distributions (with parameters alpha and beta) for each food. Based on the distribution parameters, survival or gradual inactivation of S. aureus was predicted by computer simulation. The simulations indicated an overall decline in S. aureus population over time, although a small fraction of samples in the consecutive-pair simulation showed a slight population increase even after 60 days, consistent with the observed data. Simulation results were compared to predictions from other computer models. The models of Stewart et al. were fail-safe, predicting the possibility of significant growth only after > 3,000 days. The USDA pathogen modeling program predictions were found to be fail-dangerous, predicting declines at least four times faster than observed.
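The fit-then-simulate scheme (logistic distributions of daily rates of change, accumulated over 60 days) can be sketched as below. The rate data are synthetic placeholders, not the study's measurements, and SciPy's logistic fit stands in for whatever fitting procedure the authors used.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic daily rates of change in log CFU (mostly negative: gradual
# inactivation with occasional small increases); placeholders for the
# consecutive-pair rates computed from observed survival curves.
rates = rng.normal(loc=-0.02, scale=0.10, size=200)

# Fit a logistic distribution (location alpha, scale beta) to the rates.
alpha, beta = stats.logistic.fit(rates)

# Simulate 60-day trajectories: cumulative sum of daily rate draws.
n_sim, days = 1000, 60
daily = stats.logistic.rvs(alpha, beta, size=(n_sim, days), random_state=rng)
log_change = daily.sum(axis=1)

print(f"median 60-day change: {np.median(log_change):.2f} log CFU")
print(f"fraction of simulations with net growth: {(log_change > 0).mean():.3f}")
```

Because each trajectory accumulates independent daily draws, a small fraction of simulated samples can show net growth even when the fitted mean rate is negative, mirroring the behavior reported above.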
 
Article
The relationship between smoking and lung cancer is well established and cohort studies provide estimates of risk for individual cohorts. While population trends are qualitatively consistent with smoking trends, the rates do not agree well with results from analytical studies. Four carcinogenesis models for the effect of smoking on lung cancer mortality were used to estimate lung cancer mortality rates for U.S. males: two-stage clonal expansion and multistage models using parameters estimated from two Cancer Prevention Studies (CPS I and CPS II). Calibration was essential to adjust for both shift and temporal trend. The age-period-cohort model was used for calibration. Overall, models using parameters derived from CPS I performed best, and the corresponding two-stage clonal expansion model was best overall. However, temporal calibration did significantly improve agreement with the population rates, especially the effect of age and cohort.
 
Article
Benefit-cost and risk analyses associated with agency rule making are required by presidential order and are reviewed by the Office of Management and Budget's (OMB) Office of Information and Regulatory Affairs (OIRA). This White House oversight of agency regulatory analysis means that OIRA's work, though primarily analytic, competes daily with the intense political pressures that characterize all work in the White House. Even in the face of the demands of politics, analysis is respected and utilized. A balance between politics and analysis is maintained in several ways. Several structural characteristics of the White House help protect analysis, such as a strong complement of offices whose role is primarily analytic (for example, the Council of Economic Advisors, the Office of Science and Technology Policy, and OMB). OIRA's ability to successfully coordinate its regulatory review with White House officials ensures that it serves as an agent for presidential regulatory policy at the same time it champions higher quality benefit-cost and risk analysis. This role as intermediary between analytic and political judgment also results from OIRA's expertise in such analysis and its reputation for the discretion necessary to maintain the trust of the president's upper level staff.
 
Article
Average rates of total dermal uptake (K_up) from short-term (e.g., bathing) contact with dilute aqueous organic chemicals (DAOCs) are typically estimated from steady-state in vitro diffusion-cell measures of chemical permeability (K_p) through skin into receptor solution. Widely used ("PCR-vitro") methods estimate K_up by applying diffusion theory to increase K_p predictions made by a physico-chemical regression (PCR) model that was fit to a large set of K_p measures. Here, K_up predictions for 18 DAOCs made by three PCR-vitro models (EPA, NIOSH, and MH) were compared to previous in vivo measures obtained by methods unlikely to underestimate K_up. A new PCR model fit to all 18 measures is accurate to within approximately threefold (r = 0.91, p < 10⁻⁵), but the PCR-vitro predictions (r > 0.63) all tend to underestimate the K_up measures by mean factors (UF, with p value for testing UF = 1) of 10 (EPA, p < 10⁻⁶), 11 (NIOSH, p < 10⁻⁸), and 6.2 (MH, p = 0.018). For all three PCR-vitro models, log(UF) correlates negatively with molecular weight (r² = 0.31 to 0.84, p = 0.017 to < 10⁻⁶) but not with log(vapor pressure) as an additional predictor (p > 0.05), so vapor pressure appears not to explain the significant in vivo/PCR-vitro discrepancy. Until this discrepancy is explained, careful in vivo measures of K_up should be obtained for more chemicals, the expanded in vivo database should be compared to in vitro-based predictions, and in vivo data should be considered in assessing aqueous dermal exposure and its uncertainty.
 
Article
Traditional additivity models provide little flexibility in modeling the dose-response relationships of the single agents in a mixture. While the flexible single chemical required (FSCR) methods allow greater flexibility, their implicit nature is an obstacle in the formation of the parameter covariance matrix, which forms the basis for many statistical optimality design criteria. The goal of this effort is to develop a method for constructing the parameter covariance matrix for the FSCR models, so that (local) alphabetic optimality criteria can be applied. Data from Crofton et al. are provided as motivation; in an experiment designed to determine the effect of 18 polyhalogenated aromatic hydrocarbons on serum total thyroxine (T4), the interaction among the chemicals was statistically significant. Gennings et al. fit the FSCR interaction threshold model to the data. The resulting estimate of the interaction threshold was positive and within the observed dose region, providing evidence of a dose-dependent interaction. However, the corresponding likelihood-ratio-based confidence interval was wide and included zero. In order to more precisely estimate the location of the interaction threshold, supplemental data are required. Using the available data as the first stage, the Ds-optimal second-stage design criterion was applied to minimize the variance of the hypothesized interaction threshold. Practical concerns associated with the resulting design are discussed and addressed using the penalized optimality criterion. Results demonstrate that the penalized Ds-optimal second-stage design can be used to more precisely define the interaction threshold while maintaining the characteristics deemed important in practice.
 
Article
Vinyl chloride (VC) was used as a propellant in a limited percentage of aerosol hairspray products in the United States from approximately 1967 to 1973. The question has arisen whether occupational exposures of hairdressers to VC-containing hairsprays in hair salons were sufficient to increase the risk for developing hepatic angiosarcoma (HAS). Transient two-zone and steady-state three-zone models were used to estimate the historical airborne concentration of VC for individual hairdressers using hairspray as well as estimated contributions from other hairdressers in the same salon. Concentrations of VC were modeled for small, medium, and large salons, as well as a representative home salon. Model inputs were determined using published literature, and variability in these inputs was also considered using Monte Carlo techniques. The 95th percentile for the daily time-weighted average exposure for small, medium, and large salons, assuming a market-share fraction of VC-containing hairspray use from the Monte Carlo analysis, was about 0.3 ppm, and for the home salon scenario was 0.1 ppm. The 95th percentile value for the cumulative lifetime exposure of the hairdressers was 2.8 ppm-years for the home salon scenario and 2.0 ppm-years for the small, medium, and large salon scenarios. If using the assumption that all hairsprays used in a salon contained VC, the 95th percentile of the theoretical lifetime cumulative dose was estimated to be 52-79 ppm-years. Estimated lifetime doses were all below the threshold dose for HAS of about 300 to 500 ppm-years reported in the published epidemiology literature.
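The steady-state multi-zone idea can be illustrated with the simplest near-field/far-field (two-zone) mass balance commonly used in exposure reconstruction; every input value below is a hypothetical placeholder, not one of the study's modeled inputs.

```python
# Near-field/far-field (two-zone) steady-state balance: the far field sees
# the emission diluted by room ventilation; the near field adds a term for
# the limited air exchange around the source. All values are hypothetical.

G = 10.0     # vinyl chloride emission rate during spraying, mg/min (assumed)
Q = 30.0     # salon general ventilation rate, m^3/min (assumed)
beta = 5.0   # near-field/far-field air exchange rate, m^3/min (assumed)

C_ff = G / Q             # far-field concentration, mg/m^3 (other stylists)
C_nf = C_ff + G / beta   # near-field concentration, mg/m^3 (the sprayer)

MW_VC = 62.5             # g/mol, vinyl chloride
ppm_nf = C_nf * 24.45 / MW_VC   # mg/m^3 -> ppm at 25 degrees C

print(f"far field ~{C_ff:.2f} mg/m^3, near field ~{C_nf:.2f} mg/m^3 ({ppm_nf:.2f} ppm)")
```

These are instantaneous concentrations during spraying; the study's daily time-weighted averages would further dilute such peaks across the workday and, via Monte Carlo sampling of the inputs, yield the percentile exposures reported above.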
 
Article
Worldwide data on terrorist incidents between 1968 and 2004 gathered by the RAND Corporation and the Oklahoma City National Memorial Institute for the Prevention of Terrorism (MIPT) were assessed for patterns and trends in morbidity/mortality. Adjusted data analyzed involve a total of 19,828 events, 7,401 "adverse" events (each causing ≥ 1 victim), and 86,568 "casualties" (injuries), of which 25,408 were fatal. Most terror-related adverse events, casualties, and deaths involved bombs and guns. Weapon-specific patterns and terror-related risk levels in Israel (IS) have differed markedly from those of all other regions combined (OR). IS had a fatal fraction of casualties about half that of OR, but has experienced relatively constant lifetime terror-related casualty risks on the order of 0.5%, a level 2 to 3 orders of magnitude greater than in OR, where risks increased approximately 100-fold over the same period. Individual event fatality has increased steadily, the median increasing from 14% to 50%. Lorenz curves obtained indicate substantial dispersion among victim/event rates: about half of all victims were caused by the top 2.5% (or 10%) of harm-ranked events in OR (or IS). Extreme values of victim/event rates were approximated fairly well by generalized Pareto models (typically fit to data on forest fires, sea levels, earthquakes, etc.). These results were in turn used to forecast maximum OR- and IS-specific victim/event rates through 2080, illustrating empirically based methods that could be applied to improve strategies to assess, prevent, and manage terror-related risks and consequences.
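The peaks-over-threshold use of generalized Pareto models mentioned above can be sketched on synthetic heavy-tailed data; the casualty counts, threshold choice, and concentration summary below are illustrative, not the RAND-MIPT figures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic heavy-tailed casualty counts per adverse event, standing in
# for the RAND-MIPT victim/event data.
casualties = rng.pareto(1.5, size=5000) * 2.0 + 1.0

# Peaks-over-threshold: fit a generalized Pareto distribution to
# exceedances above a high empirical threshold.
u = np.quantile(casualties, 0.95)
exceedances = casualties[casualties > u] - u
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Lorenz-style concentration: share of all victims from the top 2.5%
# of harm-ranked events.
ranked = np.sort(casualties)[::-1]
top = int(0.025 * ranked.size)
share = ranked[:top].sum() / ranked.sum()

print(f"GPD shape {shape:.2f}, scale {scale:.2f}; top 2.5% of events -> {share:.0%} of victims")
```

A positive fitted shape parameter indicates a heavy tail with no finite upper bound, which is what makes extrapolation of maximum victim/event rates to future horizons both possible and highly uncertain.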
 
Figure: General formulation of CISNET models that can handle a wide range of cancer control intervention inputs.
Article
To better understand the contribution of cigarette smoking, and its changing role in lung cancer, this article provides an introduction to a special issue of Risk Analysis, which considers the relationship between smoking and lung cancer death rates during the period 1975-2000 for U.S. men and women aged 30-84 years. Six models are employed, which are part of a consortium of lung cancer modelers funded by National Cancer Institute's Cancer Intervention and Surveillance Modeling Network (CISNET). Starting with birth-cohort-specific smoking histories derived from National Health Interview Surveys, three scenarios are modeled: Actual Tobacco Control (observed trends in smoking), Complete Tobacco Control (a counterfactual lower bound on smoking rates that could have been achieved had all smoking ceased after the first Surgeon General's report in 1964), and No Tobacco Control (a counterfactual upper bound assuming that the smoking patterns prevailing before the first studies in the 1950s began to inform the public about the hazards of smoking had continued unchanged). Using these three scenarios and the lung cancer models, the number and percentage of lung cancer deaths averted from 1975-2000, among all deaths that could have been averted had tobacco control efforts been immediate and perfect, can be estimated. The variability of the results across multiple models provides a measure of the robustness of the results to model assumptions and structure. The results provide not only a portrait of the achieved impact of tobacco control on lung cancer mortality, but also the bounds of what still needs to be achieved.
 
Figure: Shared process flow used by all models. Population and smoking inputs were used to develop the smoking history generator, which, in turn, simulates detailed individual-level smoking and other-cause mortality histories. These individual histories are used by each of the modeling groups to generate lung cancer mortality rates in the population.
Figure: U.S. population percentage of current smokers by gender and birth cohort for three different tobacco control scenarios. This is one of the outputs that can be generated from the smoking history generator. The output from the actual tobacco control scenario describes the observed data well (not shown).
Figure: Lung cancer mortality rates, standardized to the 2000 U.S. standard population, and crude counts for tobacco control scenarios as predicted by the Yale model.
Figure: Comparison of model results for actual and potential cumulative lung cancer deaths avoided during the period 1975–2000, shown on equal scales. E, Erasmus MC; F, Fred Hutchinson Cancer Research Center; M, Massachusetts General Hospital–Harvard Medical School; P, Pacific Institute for Research and Evaluation; R, Rice University–MD Anderson Cancer Center; Y, Yale University.
Article
A consortium of six research groups estimated the impact on lung cancer mortality of changes in smoking behavior that began around the publication of the Surgeon General's report (SGR). This chapter presents the results of that effort. We quantified the cumulative impact of changes in smoking behaviors on lung cancer mortality in the United States over the period 1975-2000. The six groups used common inputs and independent models to estimate the number of U.S. lung cancer deaths averted over the period 1975-2000 as a result of changes in smoking behavior beginning in the mid-1950s, and the number of deaths that could have been averted if tobacco control had completely eliminated all smoking following issuance of the first SGR on Smoking and Health in 1964. Approximately 795,000 deaths (550,000 men and 245,000 women) were averted over the period 1975-2000 as a result of changes in smoking behavior since the 1950s. In the year 2000 alone, approximately 70,000 lung cancer deaths were averted (44,000 among men and 26,000 among women). However, these represent only about 30% of the lung cancer deaths that could potentially have been averted over the period 1975-2000 had smoking been eliminated completely. In the 10-year period 1991-2000, this fraction increased to about 37%. Our results show the substantial impact of changes in smoking behavior since the 1950s. Despite this major impact, tobacco control efforts are still needed to further reduce the burden of this disease.
 
Article
Based on results reported from the NHANES II Survey (the National Health and Nutrition Examination Survey II) for people living in the United States during 1976-1980, we use exploratory data analysis, probability plots, and the method of maximum likelihood to fit lognormal distributions to percentiles of body weight for males and females as a function of age from 6 months through 74 years. The results are immediately useful in probabilistic (and deterministic) risk assessments.
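One common way to recover lognormal parameters from published percentiles is a probit-plot regression (ln x_p is linear in the standard-normal quantile z_p); the sketch below uses that least-squares shortcut with hypothetical weight percentiles, rather than the paper's full maximum-likelihood fit to the NHANES II tables.

```python
import numpy as np
from scipy import stats

# Hypothetical body-weight percentiles (kg) for one age-sex group,
# standing in for the NHANES II percentile tables.
pcts = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
weights = np.array([55.0, 65.0, 72.0, 81.0, 97.0])

# A lognormal implies ln(x_p) = mu + sigma * z_p, so regress log weight
# on the standard-normal quantiles of the percentiles (a probability plot).
z = stats.norm.ppf(pcts)
sigma, mu = np.polyfit(z, np.log(weights), 1)

gm = np.exp(mu)      # geometric mean (also the fitted median)
gsd = np.exp(sigma)  # geometric standard deviation
print(f"GM = {gm:.1f} kg, GSD = {gsd:.2f}")

# The fitted distribution can feed a probabilistic risk assessment directly:
sample = stats.lognorm.rvs(s=sigma, scale=gm, size=5, random_state=0)
```

Once the geometric mean and geometric standard deviation are in hand, body weight becomes a sampled input in probabilistic assessments, or a fixed percentile in deterministic ones.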
 
Article
Utility theory assumes that people seek personally optimal solutions, are self-interested and forward looking, and rely on consistent and rational decision-making processes. If that were the case, and if adequate information were available at little or no cost, then more than 46% of the U.S. adult population would have a last will and testament, including some people with considerable resources who currently do not;(18) every driver would use seatbelts; no one would smoke (unless they had strong biochemical and genetic information that they were smoke-tolerant); people living in areas prone to natural hazard events would have insurance; and so on. Amos Tversky and Daniel Kahneman's prospect theory posits that, in the face of uncertain risks, people group and order options and then create individual value functions for each option.(19,20) These functions are nonlinear (and steeper for losses than for gains) because we mind losing what we already have more than we mind not winning what we do not yet have. Instead of focusing on final outcomes such as accumulated wealth, people focus on changes around a reference point: gains and losses from their personal status quo or other aspiration level. Furthermore, low-probability outcomes are overweighted and moderate-to-high ones are underweighted, although certainty is overweighted compared to near-certainty. Prospect theory has been featured primarily in the economics literature, and indeed Kahneman and Tversky's 1979 paper is the most cited paper in Econometrica.(19)
 
Article
Generally, hazards research and literature have treated natural and technological disasters as separate entities. This study attempts to determine how frequently interaction between these two types of disaster took place in the United States from 1980-1989. Data were collected by performing a literature review, contacting organizations and individuals active in hazards research and mitigation, and through a questionnaire sent to the emergency management agencies of all 50 states. The consensus derived from the data is that the number of incidents where natural and technological disasters interact is rising, while preparations that recognize the complications inherent in such combined events remain cursory. There is a pressing need for states to record, and make available to managers, information regarding the number of combined natural/technological events affecting their areas. Only when such data are available will it be possible to make appropriate decisions regarding the best way to reduce the effects of a natural disaster causing a catastrophic release of hazardous materials.
 
Article
In early 1979 Robert B. Cumming recognized the growing need for risk researchers and practitioners to publish their work in a dedicated professional journal. This led to the formation of an organization to support such a journal, with the Certificate of Incorporation for the Society for Risk Analysis (SRA) made official on August 28, 1980. The first issue of Risk Analysis: An International Journal appeared in March 1981. This article reviews the history of the SRA's first 25 years. It reviews the SRA's formation and growth, provides analyses of its major products (the journal, newsletter, conferences, and meetings), and discusses its impact. This article relies on published literature and a history of the SRA's first 20 years written by two of the authors. This history covers the SRA's many successes, which demonstrate the strength and vitality of the organization and provide optimism for its future. These successes include this journal, which published over 2,150 papers between March 1981 and this December 2005 issue. The successes also include its stable membership of approximately 2,000 members from 43 countries, well-attended annual meetings, and increasing support for true international growth as demonstrated by international risk forums like the World Congress on Risk held in Brussels in June 2003. Similarly, the history also covers the SRA's challenges and difficulties, with the recognition that these provide both an important context about the organization and the opportunity to learn from past experiences. These include the challenges associated with spin-off organizations that decreased the SRA membership in some disciplinary areas, notably in engineering and exposure assessment. This history also includes quantitative analyses of the contents of the first 25 years of Risk Analysis: An International Journal. 
The results show significant growth in the number of articles published each year, from approximately 30 articles per year in the first few years to over 120 articles per year now. They also show a relatively even distribution of articles across the life, physical, and social sciences, which demonstrates the sustained commitment of the SRA and the journal to interdisciplinary risk-related research. The SRA organizational structure currently includes two sections (SRA-Europe and SRA-Japan), 22 Chapters, and nine Specialty Groups, and the structure remains somewhat in flux. We present this history in five sections that cover major themes: (1) SRA formation, (2) membership and organization, (3) publications, (4) meetings, and (5) thematic issues. Like any organization of its size, the SRA boasts a long and diverse history, and no article can possibly capture it all. We hope that in documenting the first 25 years, we strengthen the SRA by providing some perspective on its roots and a rigorous quantitative analysis of some of its products.
 
Article
The potentially huge financial liability arising from asbestos product suits, and the resulting filings for reorganization in bankruptcy by Manville, UNR Industries, Inc., and Amatex, have become a major public policy concern. In response to the problem, several bills have been introduced in Congress to provide compensation for asbestos (and other occupational disease) victims. This paper estimates the cost of compensating asbestos victims under the provisions of the “Occupational Disease Compensation Act of 1983,” introduced by Congressman George Miller. Using fatality projections from studies by Enterline, Selikoff, and Walker, together with assumptions about likely claim filing and success rates, duration and degree of disability, and medical expenses, first-year costs for this legislation are estimated to range from a low of $131 million to a high of $1.9 billion. Present value cost estimates at a 2% real discount rate range from $3 billion to $56 billion. The paper also estimates the impact of possible modifications to the compensation provisions of the legislation. Reducing medical payments by the amount received from Medicare would lower costs by 3–4%. Providing survivors with a 3-year lump-sum benefit rather than a 5-year lump-sum payment would save 20–25%, as would offsetting the 5-year lump sum by expected Social Security old-age and disability benefits. Combining all of these changes would reduce costs by almost 50%.
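The present value figures quoted above discount a stream of future annual compensation costs back to today. A minimal sketch of that calculation, using a hypothetical constant annual cost stream (the article's actual estimates derive from fatality projections and claims assumptions, not a flat stream):

```python
# Illustrative sketch: present value of a stream of annual compensation
# costs at a constant real discount rate. The cost stream below is a
# hypothetical placeholder, not the article's estimate.

def present_value(annual_costs, rate):
    """Discount annual costs (incurred at end of years 1, 2, ...) to today."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(annual_costs, start=1))

# Example: a constant $1.9 billion per year for 30 years at a 2% real rate.
stream = [1.9e9] * 30
pv = present_value(stream, 0.02)

# Sanity check against the closed-form annuity formula:
# PV = c * (1 - (1 + r)^-T) / r
annuity = 1.9e9 * (1 - 1.02 ** -30) / 0.02
assert abs(pv - annuity) < 1e-9 * annuity
```

The choice of discount rate matters greatly over multi-decade horizons, which is why the article reports present value estimates explicitly tied to a 2% real rate.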
 
Figure: A diachronic model of circuits of culture (developed from Johnson, 1986; see Du Gay, 1997 for a different formulation).
Figure: Distribution of newspaper articles on climate change, 1985–2003.
Article
This article argues for a cultural perspective to be brought to bear on studies of climate change risk perception. Developing the "circuit of culture" model, the article maintains that the producers and consumers of media texts are jointly engaged in dynamic, meaning-making activities that are context-specific and that change over time. A critical discourse analysis of climate change based on a database of newspaper reports from three U.K. broadsheet papers over the period 1985-2003 is presented. This empirical study identifies three distinct circuits of climate change (1985-1990, 1991-1996, and 1997-2003), each characterized by a different framing of the risks associated with climate change. The article concludes that there is evidence of social learning as actors build on their experiences of climate change science and policy making. Two important factors shaping the U.K. broadsheet newspapers' discourse on "dangerous" climate change emerge: the agency of top political figures and the dominant ideological standpoints of the different newspapers.
 
Article
The precautionary principle calls on decisionmakers to take preventive action in light of evidence indicating that there is a potential for harm to public health and the environment, even though the nature and magnitude of harm are not fully understood scientifically. Critics of the precautionary principle frequently argue that unbridled application of the principle leads to unintended damage to health and ecosystems (risk tradeoffs) and that precautionary decision making leaves us vulnerable to "false-positive" risks that divert resources away from "real risks." The 1991 cholera epidemic in Peru is often cited as an example of these pitfalls of the precautionary principle. It has been mistakenly argued that application of the precautionary principle caused decisionmakers to stop chlorinating the water supply due to the risks of disinfection byproducts (DBPs), resulting in the epidemic. Through analyses of investigations conducted in the cities of Iquitos and Trujillo, Peru, literature review, and interviews with leading Peruvian infectious disease researchers, we determined that the epidemic was caused by a much more complex set of circumstances, including poor sanitation conditions, poor separation of water and waste streams, and inadequate water treatment and distribution systems. The evidence indicates that no decision was made to stop chlorinating on the basis of DBP concerns, and that concerns raised about DBPs masked more important factors limiting expansion of chlorination. In fact, outside Peru's capital, Lima, chlorination of drinking water supplies at the time of the epidemic was limited at best. We conclude that the Peruvian cholera epidemic was not caused by a failure of precaution but rather by an inadequate public health infrastructure unable to control a known risk: that of microbial contamination of water supplies.
 
Article
This article develops and fits probability distributions for the variability in projected (total) job tenure for adult men and women in 31 industries and 22 occupations based on data reported by the U.S. Department of Labor's Bureau of Labor Statistics. It extends previously published results and updates those results from January 1987 to February 1996. The model provides probability distributions for the variability in projected (total) job tenures within the time range of the data, and it extrapolates the distributions beyond the time range of the data, i.e., beyond 25 years.
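Fitting a parametric distribution to tenure data of the kind described above can be sketched as follows. The synthetic data, the Weibull family, and all parameter values here are assumptions of this sketch; the article's actual distributions are fitted to BLS data and are not reproduced here:

```python
# Minimal sketch of fitting a parametric distribution to job-tenure data.
# Synthetic data and the Weibull family are used purely for illustration;
# the article's actual data (BLS) and distribution family are not assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic "projected total tenure" sample in years (right-skewed).
tenure = rng.weibull(1.5, size=2000) * 12.0

# Maximum-likelihood fit; fix the location at 0 since tenure is non-negative.
shape, loc, scale = stats.weibull_min.fit(tenure, floc=0)

# The fitted survival function can then extrapolate beyond the data range,
# e.g., the probability that projected tenure exceeds 25 years.
p_beyond_25 = stats.weibull_min.sf(25.0, shape, loc=loc, scale=scale)
```

Extrapolating a fitted distribution beyond the range of the data, as the article does past 25 years, rests entirely on the assumed parametric form, so the choice of family should be checked against the empirical tail.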
 
Article
The conceptual and computational structure of a performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) is described. Important parts of this structure are (1) maintenance of a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000-year regulatory period that applies to the WIPP, and subjective uncertainty arising from the imprecision with which many of the quantities required in the analysis are known, (2) use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (3) use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (4) efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The WIPP is under development by the U.S. Department of Energy (DOE) for the geologic (i.e., deep underground) disposal of transuranic (TRU) waste, with the indicated PA supporting a Compliance Certification Application (CCA) by the DOE to the U.S. Environmental Protection Agency (EPA) in October 1996 for the necessary certifications for the WIPP to begin operation. The EPA certified the WIPP for the disposal of TRU waste in May 1998, with the result that the WIPP will be the first operational facility in the United States for the geologic disposal of radioactive waste.
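The two-loop structure described above (Latin hypercube sampling over epistemic parameters, Monte Carlo sampling over aleatory futures) can be sketched in miniature. The release model, parameter names, and ranges below are invented for illustration and bear no relation to the actual WIPP PA models:

```python
# Toy sketch of the two-loop uncertainty structure: an outer Latin
# hypercube sample over epistemically uncertain parameters, and an inner
# Monte Carlo sample over stochastic (aleatory) disruption futures.
# The release model and parameter ranges are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n, d):
    """One point per equal-probability stratum in each of d dimensions."""
    u = np.empty((n, d))
    for j in range(d):
        u[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return u

n_epistemic, n_aleatory = 50, 1000

# Epistemic parameters (imprecisely known constants), scaled from [0, 1):
lhs = latin_hypercube(n_epistemic, 2)
leach_rate = 1e-4 + lhs[:, 0] * 9e-4    # assumed range [1e-4, 1e-3]
barrier_factor = 0.1 + lhs[:, 1] * 0.8  # assumed range [0.1, 0.9]

means = np.empty(n_epistemic)
for i in range(n_epistemic):
    # Aleatory loop: a random number of disruption events per future;
    # total release taken proportional to the event count.
    n_events = rng.poisson(2.0, size=n_aleatory)
    release = n_events * leach_rate[i] * (1 - barrier_factor[i])
    means[i] = release.mean()

# 'means' is a sample from the epistemic distribution over the
# aleatory-expected release; its spread reflects subjective uncertainty.
```

Keeping the two loops separate is what allows the analysis to report a family of outcome distributions (one per epistemic sample) rather than a single blended distribution, which is the separation of stochastic and subjective uncertainty the abstract emphasizes.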
 
Article
Whether and to what extent contaminated sites harm ecologic and human health are topics of considerable interest, but also considerable uncertainty. Several federal and state agencies have approved the use of some or many aspects of probabilistic risk assessment (PRA), but its site-specific application has often been limited to high-profile sites and large projects. Nonetheless, times are changing: newly developed software tools, and recent federal and state guidance documents formalizing PRA procedures, now make PRA a readily available method of analysis for even small-scale projects. This article presents and discusses a broad review of PRA literature published since 2000.
 
Top-cited authors
Paul Slovic
  • University of Oregon
Ortwin Renn
  • Universität Stuttgart
Michael Siegrist
Baruch Fischhoff
  • Carnegie Mellon University
Robert Lloyd Goble
  • Clark University