Risk Analysis

Publisher: Society for Risk Analysis, Wiley

Current impact factor: 2.50

Impact Factor Rankings

2016 Impact Factor Available summer 2017
2014 / 2015 Impact Factor 2.502
2013 Impact Factor 1.974
2012 Impact Factor 2.278
2011 Impact Factor 2.366
2010 Impact Factor 2.096
2009 Impact Factor 1.953
2008 Impact Factor 1.831
2007 Impact Factor 1.784
2006 Impact Factor 1.938
2005 Impact Factor 1.51
2004 Impact Factor 1.321
2003 Impact Factor 1.064

Impact factor over time

[Chart omitted; the yearly impact factor values are listed above]

Additional details

5-year impact: 2.54
Cited half-life: 9.40
Immediacy index: 0.27
Eigenfactor: 0.01
Article influence: 1.00
Other titles: Risk analysis (Online), Risk analysis
ISSN: 1539-6924
OCLC: 45175725
Material type: Document, Periodical, Internet resource
Document type: Internet Resource, Computer File, Journal / Magazine / Newspaper

Publisher details

Wiley

  • Pre-print
    • Author can archive a pre-print version
  • Post-print
    • Author cannot archive a post-print version
  • Restrictions
    • 2-year embargo
  • Conditions
    • Some journals have separate policies; please check with each journal directly
    • On author's personal website, institutional repositories, arXiv, AgEcon, PhilPapers, PubMed Central, RePEc or Social Science Research Network
    • Author's pre-print may not be updated with Publisher's Version/PDF
    • Author's pre-print must acknowledge acceptance for publication
    • Non-Commercial
    • Publisher's version/PDF cannot be used
    • Publisher source must be acknowledged with citation
    • Must link to publisher version with set statement (see policy)
    • If OnlineOpen is available, BBSRC, EPSRC, MRC, NERC and STFC authors may self-archive after 12 months
    • If OnlineOpen is available, AHRC and ESRC authors may self-archive after 24 months
    • Publisher last contacted on 07/08/2014
    • This policy is an exception to the default policies of 'Wiley'
  • Classification
    yellow

Publications in this journal

  • Victoria A Johnson · Kevin R Ronan · David M Johnston · Robin Peace
    ABSTRACT: A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches.
    No preview · Article · Feb 2016 · Risk Analysis
  • Craig W Trumbo · Lori Peek · Michelle A Meyer · Holly L Marlatt · Eve Gruntfest · Brian D McNoldy · Wayne H Schubert
    ABSTRACT: The aim of this study was to develop a reliable and valid measure of hurricane risk perception. The utility of such a measure lies in the need to understand how people make decisions when facing an evacuation order. This study included participants located within a 15-mile buffer of the Gulf and southeast Atlantic U.S. coasts. The study was executed as a three-wave panel with mail surveys in 2010-2012 (T0 baseline N = 629, 56%; T1 retention N = 427, 75%; T2 retention N = 350, 89%). An inventory based on the psychometric model was developed to discriminate cognitive and affective perceptions of hurricane risk, and included open-ended responses to solicit additional concepts in the T0 survey. Analysis of the T0 data modified the inventory and this revised item set was fielded at T1 and then replicated at T2 . The resulting scales were assessed for validity against existing measures for perception of hurricane risk, dispositional optimism, and locus of control. A measure of evacuation expectation was also examined as a dependent variable, which was significantly predicted by the new measures. The resulting scale was found to be reliable, stable, and largely valid against the comparison measures. Despite limitations involving sample size, bias, and the strength of some reliabilities, it was concluded that the measure has potential to inform approaches to hurricane preparedness efforts and advance planning for evacuation messages, and that the measure has good promise to generalize to other contexts in natural hazards as well as other domains of risk.
    No preview · Article · Feb 2016 · Risk Analysis
  • Jinshu Cui · Heather Rosoff · Richard S John
    ABSTRACT: There is a paucity of research examining public response to the cumulative effects of multiple related extreme events over time. We investigated the separate and combined effects of frequency and trajectory of terrorist attacks. A scenario simulation of a series of gas station bombings in Southern California was developed to evaluate respondents' affect, risk perception, and intended avoidance behavior using a 3 (frequency; low vs. medium vs. high) by 3 (trajectory; increasing vs. constant vs. decreasing) factorial design. For each of the nine conditions, three videos were created to simulate news broadcasts documenting the attacks over a three-week period. A total of 275 respondents were included in the analysis. Results from analysis of covariances (ANCOVAs) indicate that trajectory of the sequential attacks (increasing or decreasing in frequency) predicts negative affect, risk perception, and avoidance behavior. In contrast, frequency predicts neither negative affect, positive affect, risk perception, nor intended avoidance behavior. Results from structural equation modeling (SEM) further indicate that the effect of negative affect on behavioral intention is mediated by risk perception and the effect of trajectory on risk perception is partially mediated by negative affect. In addition, both ANCOVAs and SEM model results suggest that (1) females experience less positive affect and perceive more risk than males, (2) respondents with higher income perceive more risk, and (3) younger respondents are more likely to modify their behavior to avoid the risk of future attacks.
    No preview · Article · Feb 2016 · Risk Analysis
  •
    ABSTRACT: A Bayesian statistical temporal-prevalence-concentration model (TPCM) was built to assess the prevalence and concentration of pathogenic Campylobacter species in batches of fresh chicken and turkey meat at retail. The data set was collected from Finnish grocery stores in all the seasons of the year. Observations at low concentration levels are often censored due to the limit of determination of the microbiological methods. This model utilized the potential of Bayesian methods to borrow strength from related samples in order to perform under heavy censoring. In this extreme case the majority of the observed batch-specific concentrations was below the limit of determination. The hierarchical structure was included in the model in order to take into account the within-batch and between-batch variability, which may have a significant impact on the sample outcome depending on the sampling plan. Temporal changes in the prevalence of Campylobacter were modeled using a Markovian time series. The proposed model is adaptable for other pathogens if the same type of data set is available. The computation of the model was performed using OpenBUGS software.
    No preview · Article · Feb 2016 · Risk Analysis
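
The way censored observations enter such a model can be sketched outside OpenBUGS as well. The snippet below is a minimal maximum-likelihood illustration, assuming a lognormal concentration distribution and an invented limit of determination; it is not the article's hierarchical TPCM, only a demonstration of how values below the limit contribute a CDF term rather than a density term.

```python
# Minimal sketch of left-censored concentration data under an assumed lognormal
# model. This is NOT the authors' hierarchical OpenBUGS TPCM, just an
# illustration of how censoring enters the likelihood.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
LOD = 10.0                      # limit of determination (CFU/g), assumed
true_mu, true_sigma = 1.5, 1.2  # log-scale parameters, assumed

conc = rng.lognormal(true_mu, true_sigma, size=200)
observed = conc[conc >= LOD]            # quantified samples
n_censored = np.sum(conc < LOD)         # samples reported only as "< LOD"

def neg_log_lik(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    # density term for quantified observations
    ll = stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu)).sum()
    # CDF term for observations below the limit of determination
    ll += n_censored * stats.lognorm.logcdf(LOD, s=sigma, scale=np.exp(mu))
    return -ll

fit = optimize.minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"estimated mu={mu_hat:.2f}, sigma={sigma_hat:.2f} "
      f"({n_censored} of {conc.size} observations censored)")
```

In a fully Bayesian treatment, such as the article's OpenBUGS model, the same censored terms appear in the likelihood, with hierarchical priors borrowing strength across batches and a Markovian prior on seasonal prevalence.
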
  •
    ABSTRACT: In response to the European Food Safety Authority's wish to assess the reduction of human cases of salmonellosis by implementing control measures at different points in the farm-to-consumption chain for pork products, a quantitative microbiological risk assessment (QMRA) was developed. The model simulated the occurrence of Salmonella from the farm to consumption of pork cuts, minced meat, and fermented ready-to-eat sausage, respectively, and a dose-response model was used to estimate the probability of illness at consumption. The QMRA has a generic structure with a defined set of variables, whose values are changed according to the E.U. member state (MS) of interest. In this article we demonstrate the use of the QMRA in four MSs, representing different types of countries. The predicted probability of illness from the QMRA was between 1 in 100,000 and 1 in 10 million per serving across all three product types. Fermented ready-to-eat sausage imposed the highest probability of illness per serving in all countries, whereas the risks per serving of minced meat and pork chops were similar within each MS. For each of the products, the risk varied by a factor of 100 between the four MSs. The influence of lack of information for different variables was assessed by rerunning the model with alternative, more extreme, values. Out of the large number of uncertain variables, only a few of them have a strong influence on the probability of illness, in particular those describing the preparation at home and consumption.
    No preview · Article · Feb 2016 · Risk Analysis
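
The per-serving risk calculation described above can be illustrated with a small Monte Carlo sketch. All distributions and parameter values below (prevalence, concentrations, serving sizes, and the approximate beta-Poisson dose-response parameters) are assumptions for demonstration, not the inputs of the EFSA-commissioned QMRA.

```python
# Illustrative Monte Carlo for per-serving probability of illness, loosely
# following the farm-to-consumption logic described above. All distributions
# and parameters are assumed for demonstration only.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

prev = 0.05                                   # fraction of contaminated servings (assumed)
contaminated = rng.random(n) < prev
log10_conc = rng.normal(-1.0, 1.0, n)         # log10 CFU/g after preparation (assumed)
serving_g = rng.triangular(50, 100, 200, n)   # serving size in grams (assumed)
dose = np.where(contaminated, 10 ** log10_conc * serving_g, 0.0)

alpha, beta = 0.13, 51.45                     # illustrative dose-response parameters
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha) # approximate beta-Poisson dose-response

print(f"mean probability of illness per serving: {p_ill.mean():.2e}")
```
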
  •
    ABSTRACT: As industrial development is increasing near northern Canadian communities, human health risk assessments (HHRA) are conducted to assess the predicted magnitude of impacts of chemical emissions on human health. One exposure pathway assessed for First Nations communities is the consumption of traditional plants, such as muskeg tea (Labrador tea) (Ledum/Rhododendron groenlandicum) and mint (Mentha arvensis). These plants are used to make tea and are not typically consumed in their raw form. Traditional practices were used to harvest muskeg tea leaves and mint leaves by two First Nations communities in northern Alberta, Canada. Under the direction of community elders, community youth collected and dried plants to make tea. Soil, plant, and tea decoction samples were analyzed for inorganic elements using inductively coupled plasma-mass spectrometry. Concentrations of inorganic elements in the tea decoctions were orders of magnitude lower than in the vegetation (e.g., manganese 0.107 mg/L in tea, 753 mg/kg in leaves). For barium, the practice of assessing ingestion of raw vegetation would have resulted in a hazard quotient (HQ) greater than the benchmark of 0.2. Using measured tea concentrations it was determined that exposure would result in risk estimates orders of magnitude below the HQ benchmark of 0.2 (HQ = 0.0049 and 0.017 for muskeg and mint tea, respectively). An HHRA calculating exposure to tea vegetation through direct ingestion of the leaves may overestimate risk. The results emphasize that food preparation methods must be considered when conducting an HHRA. This study illustrates how collaboration between Western scientists and First Nations communities can add greater clarity to risk assessments.
    No preview · Article · Feb 2016 · Risk Analysis
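
The hazard quotient comparison underlying these conclusions reduces to simple arithmetic. In the sketch below, the manganese concentrations are those quoted in the abstract, while the intake rates, body weight, and reference dose are assumed illustrative values, not the study's.

```python
# Hazard quotient sketch: exposure from the tea decoction vs. exposure assuming
# raw leaf ingestion. Intake rates, body weight, and reference dose are assumed
# illustrative values; the manganese concentrations are those quoted above.
tea_conc_mg_per_L = 0.107       # measured in tea decoction (from the abstract)
leaf_conc_mg_per_kg = 753.0     # measured in dried leaves (from the abstract)

tea_intake_L_per_day = 0.5      # assumed
leaf_intake_kg_per_day = 0.005  # assumed (if leaves were eaten raw)
body_weight_kg = 70.0           # assumed
reference_dose = 0.14           # mg/kg bw/day, assumed for illustration

hq_tea = (tea_conc_mg_per_L * tea_intake_L_per_day) / (body_weight_kg * reference_dose)
hq_leaf = (leaf_conc_mg_per_kg * leaf_intake_kg_per_day) / (body_weight_kg * reference_dose)
print(f"HQ from tea decoction: {hq_tea:.4f}; HQ assuming raw leaf ingestion: {hq_leaf:.2f}")
# The tea-based HQ falls far below the 0.2 benchmark, while the raw-leaf
# assumption can push the HQ above it, mirroring the study's point.
```
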
  •
    ABSTRACT: In this article we present a model for Salmonella contamination of pig carcasses in the slaughterhouse. This model forms part of a larger QMRA (quantitative microbial risk assessment) on Salmonella in slaughter and breeder pigs, which uses a generic model framework that can be parameterized for European member states, to describe the entire chain from farm-to-consumption and the resultant human illness. We focus on model construction, giving mathematical formulae to describe Salmonella concentrations on individual pigs and slaughter equipment at different stages of the slaughter process. Variability among individual pigs and over slaughterhouses is incorporated using statistical distributions, and simulated by Monte Carlo iteration. We present the results over the various slaughter stages and show that such a framework is especially suitable to investigate the effect of various interventions. In this article we present the results of the slaughterhouse module for two case study member states. The model outcome represents an increase in average prevalence of Salmonella contamination and Salmonella numbers at dehairing and a decrease of Salmonella numbers at scalding. These results show good agreement when compared to several other QMRAs and microbiological studies.
    No preview · Article · Feb 2016 · Risk Analysis
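
A schematic of the stage-by-stage bookkeeping described above is sketched below. The stage effects (log10 changes) and their spreads are assumed values chosen only to reproduce the qualitative pattern (increase at dehairing, decrease at scalding); they are not the article's formulae or parameterization.

```python
# Schematic Monte Carlo of Salmonella numbers on carcasses across slaughter
# stages. Stage effects and their variability are assumed illustrative values.
import numpy as np

rng = np.random.default_rng(3)
n_pigs = 50_000

log10_n = rng.normal(2.0, 1.0, n_pigs)            # log10 CFU per carcass at lairage (assumed)
stage_effects = {
    "scalding":     rng.normal(-2.0, 0.5, n_pigs),  # reduction (assumed)
    "dehairing":    rng.normal(+1.0, 0.7, n_pigs),  # cross-contamination increase (assumed)
    "singeing":     rng.normal(-1.5, 0.5, n_pigs),  # reduction (assumed)
    "evisceration": rng.normal(+0.3, 0.6, n_pigs),  # slight increase (assumed)
}
for stage, effect in stage_effects.items():
    log10_n = log10_n + effect
    print(f"after {stage:12s}: mean log10 CFU = {log10_n.mean():5.2f}")
```
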
  •
    ABSTRACT: In August 2012, Hurricane Isaac, a Category 1 hurricane at landfall, caused extensive power outages in Louisiana. The storm brought high winds, storm surge, and flooding to Louisiana, and power outages were widespread and prolonged. Hourly power outage data for the state of Louisiana were collected during the storm and analyzed. This analysis included correlation of hourly power outage figures by zip code with storm conditions including wind, rainfall, and storm surge using a nonparametric ensemble data mining approach. Results were analyzed to understand how correlation of power outages with storm conditions differed geographically within the state. This analysis provided insight on how rainfall and storm surge, along with wind, contribute to power outages in hurricanes. By conducting a longitudinal study of outages at the zip code level, we were able to gain insight into the causal drivers of power outages during hurricanes. Our analysis showed that the statistical importance of storm characteristic covariates to power outages varies geographically. For Hurricane Isaac, wind speed, precipitation, and previous outages generally had high importance, whereas storm surge had lower importance, even in zip codes that experienced significant surge. The results of this analysis can inform the development of power outage forecasting models, which often focus strictly on wind-related covariates. Our study of Hurricane Isaac indicates that inclusion of other covariates, particularly precipitation, may improve model accuracy and robustness across a range of storm conditions and geography.
    No preview · Article · Feb 2016 · Risk Analysis
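
The abstract does not name the specific ensemble method, so the sketch below uses a random forest as one common nonparametric ensemble choice, fit to synthetic data standing in for the zip-code-level outage records; covariate names and distributions are assumptions.

```python
# Sketch of assessing the importance of storm covariates for outage counts with
# a nonparametric ensemble method (random forest used as one common choice).
# The data are synthetic stand-ins; storm surge is deliberately given no effect.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 5_000                                  # synthetic zip-code-hour records
wind = rng.gamma(2.0, 10.0, n)             # wind speed (assumed units/scale)
precip = rng.gamma(1.5, 5.0, n)            # precipitation (assumed)
surge = rng.exponential(0.5, n)            # storm surge (assumed)
prev_outages = rng.poisson(5.0, n)
outages = rng.poisson(0.05 * wind + 0.08 * precip + 0.3 * prev_outages)

X = np.column_stack([wind, precip, surge, prev_outages])
names = ["wind_speed", "precipitation", "storm_surge", "prev_outages"]

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, outages)
for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:15s} importance = {imp:.3f}")
```

Fitting such a model separately by region (or cluster of zip codes) is one way to see how covariate importance varies geographically, as the study reports.
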
  •
    ABSTRACT: The results of a survey and an experiment show that experiential uncertainty (people's experience of uncertainty in risk contexts) plays a moderating role in individuals' risk-related demand for government regulation and trust in risk-managing government institutions. First, descriptions of risks were presented to respondents in a survey (N = 1,017) and their reactions to questions about experiential uncertainty, risk perception, and demand for government regulation were measured, as well as levels of risk-specific knowledge. When experiential uncertainty was high, risk perceptions had a positive relationship with demand for government regulation of risk; no such relationship emerged under low experiential uncertainty. Conversely, when people experienced little experiential uncertainty, having more knowledge about the risk topic involved was associated with a weaker demand for government regulation of risk. For people experiencing uncertainty, this relationship between knowledge and demand for regulation did not emerge. Second, in an experiment (N = 120), experiential uncertainty and openness in risk communication were manipulated to investigate effects on trust. In the uncertainty condition, the results showed that open versus nonopen government communication about Q-fever (a zoonosis) led to higher levels of trust in the government agency, but not in the control condition. Altogether, this research suggests that knowledge provision may preclude people from demanding government action only when they experience relatively little uncertainty about the risk. Also, only when persons experience uncertainty are stronger risk perceptions associated with a demand for government regulation, and only then is institutional trust affected by openness of risk communication.
    No preview · Article · Feb 2016 · Risk Analysis
  •
    ABSTRACT: Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
    No preview · Article · Feb 2016 · Risk Analysis
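
The article's examples are written in MATLAB and R; the sketch below shows the same embarrassingly parallel pattern in Python, with a placeholder quantity of interest, purely as an additional illustration.

```python
# Embarrassingly parallel Monte Carlo: each replication is independent, so the
# work can be split across processes. Same pattern as discussed in the article,
# but in Python rather than the article's MATLAB/R examples.
import numpy as np
from multiprocessing import Pool

def one_replication(seed):
    """One independent simulation run; returns a scalar output of interest."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=1_000_000)
    return np.mean(np.exp(-x ** 2))   # placeholder quantity of interest

if __name__ == "__main__":
    seeds = range(64)
    with Pool(processes=8) as pool:   # serial version: list(map(one_replication, seeds))
        results = pool.map(one_replication, seeds)
    print(f"estimate: {np.mean(results):.6f} (spread across runs: {np.std(results):.2e})")
```
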
  •
    ABSTRACT: In statistical applications, logistic regression is a popular method for analyzing binary data accompanied by explanatory variables. But when one of the two outcomes is rare, the estimation of model parameters has been shown to be severely biased and hence estimating the probability of rare events occurring based on a logistic regression model would be inaccurate. In this article, we focus on estimating the probability of rare events occurring based on logistic regression models. Instead of selecting a best model, we propose a local model averaging procedure based on a data perturbation technique applied to different information criteria to obtain different probability estimates of rare events occurring. Then an approximately unbiased estimator of Kullback-Leibler loss is used to choose the best one among them. We design complete simulations to show the effectiveness of our approach. For illustration, a necrotizing enterocolitis (NEC) data set is analyzed.
    No preview · Article · Feb 2016 · Risk Analysis
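
A much-simplified stand-in for this idea is sketched below: rare-event probability estimates from several logistic-regression fits to perturbed (bootstrap) data sets are averaged. The synthetic data, candidate covariate subsets, and simple averaging are assumptions for illustration; they do not reproduce the article's information-criterion and Kullback-Leibler selection procedure.

```python
# Simplified sketch of averaging rare-event probability estimates over several
# logistic-regression fits to perturbed data sets. A stand-in for the article's
# local model averaging procedure, not a reimplementation of it.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
X = rng.normal(size=(n, 3))
lin = -5.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1]              # rare outcome (~1% event rate)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(float)

candidate_cols = [[0], [0, 1], [0, 1, 2]]               # candidate covariate subsets
x_new = np.array([1.0, 1.0, 0.0])                       # point at which risk is wanted

preds = []
for _ in range(20):                                     # perturbed (bootstrap) data sets
    idx = rng.integers(0, n, n)
    for cols in candidate_cols:
        exog = sm.add_constant(X[idx][:, cols])
        fit = sm.Logit(y[idx], exog).fit(disp=0)
        preds.append(float(fit.predict(np.r_[1.0, x_new[cols]].reshape(1, -1))))

print(f"averaged estimate of the rare-event probability: {np.mean(preds):.4f}")
```
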
  •
    ABSTRACT: Melamine contamination of food has become a major food safety issue because of incidents of infant disease caused by exposure to this chemical. This study was aimed at establishing a safety limit in Taiwan for the degree of melamine migration from food containers. Health risk assessment was performed for three exposure groups (preschool children, individuals who dine out, and elderly residents of nursing homes). Selected values of tolerable daily intake (TDI) for melamine were used to calculate the reference migration concentration limit (RMCL) or reference specific migration limit (RSML) for melamine food containers. The only existing values of these limits for international standards today are 1.2 mg/L (0.2 mg/dm²) in China and 30 mg/L (5 mg/dm²) in the European Union. The factors used in the calculations included the specific surface area of food containers, daily food consumption rate, body weight, TDI, and the percentile of the population protected at a given migration concentration limit (MCL). The results indicate that children are indeed at higher risk of melamine exposure at toxic levels than are other groups and that the 95th percentile of MCL (specific surface area = 5) for children aged 1-6 years should be the RMCL (0.07 mg/dm²) for protecting the sensitive and general population.
    No preview · Article · Feb 2016 · Risk Analysis
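
The deterministic core of an RMCL calculation can be written as a one-line formula. In the sketch below the TDI is a commonly cited value for melamine, while the body weight, food intake, and specific surface area are assumed point values; the study itself uses survey-derived distributions and reports a protective percentile.

```python
# Back-of-envelope form of a reference migration concentration limit (RMCL):
# the migration level (mg/dm^2) at which dietary exposure just reaches the TDI.
# Body weight, food intake, and specific surface area are assumed point values,
# not the study's survey-derived distributions.
tdi_mg_per_kg_bw = 0.2             # tolerable daily intake, commonly cited value
body_weight_kg = 16.0              # young child (assumed)
food_intake_kg_per_day = 1.0       # food contacting melamine ware per day (assumed)
specific_surface_dm2_per_kg = 5.0  # contact area per kg of food (as in the abstract)

rmcl_mg_per_dm2 = (tdi_mg_per_kg_bw * body_weight_kg) / (
    food_intake_kg_per_day * specific_surface_dm2_per_kg)
print(f"illustrative RMCL: {rmcl_mg_per_dm2:.2f} mg/dm^2")
# A probabilistic version replaces the point values with distributions and
# reports a protective percentile, as the study does with the 95th percentile.
```
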
  •
    ABSTRACT: The aim of the overall project was, in part, to develop a generic structured quantitative microbiological risk assessment (QMRA) model of human salmonellosis due to pork consumption in EU member states (MSs); the objective of the cluster analysis reported here was to group the EU MSs according to the relative contribution of different pathways of Salmonella in the farm-to-consumption chain of pork products. By selecting a case study MS from each cluster, the model was developed to represent different aspects of pig production, pork production, and consumption of pork products across EU states. The MSs were aggregated into groups of countries with similar importance of the different Salmonella pathways in the farm-to-consumption chain using available, and where possible universal, register data related to pork production and consumption in each country. Based on MS-specific information about the distribution of (i) small and large farms, (ii) small and large slaughterhouses, (iii) the amount of pork meat consumed, and (iv) the amount of sausages consumed, we used nonhierarchical and hierarchical cluster analysis to group the MSs. The cluster solutions were validated internally using statistical measures and externally by comparing the clustered MSs with an estimated human incidence of salmonellosis due to pork products in the MSs. Finally, each cluster was characterized qualitatively using the centroids of the clusters.
    No preview · Article · Feb 2016 · Risk Analysis
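
The two clustering steps described above can be sketched as follows, using synthetic register-style features as a stand-in for the MS data; the feature names, values, and number of clusters are assumptions.

```python
# Sketch of the nonhierarchical (k-means) and hierarchical (Ward) grouping of
# member states from register-style features. Data are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
states = [f"MS{i:02d}" for i in range(1, 26)]
df = pd.DataFrame({
    "share_large_farms": rng.uniform(0.1, 0.9, len(states)),
    "share_large_slaughterhouses": rng.uniform(0.2, 0.95, len(states)),
    "pork_consumption_kg_per_capita": rng.uniform(15, 55, len(states)),
    "sausage_consumption_kg_per_capita": rng.uniform(2, 20, len(states)),
}, index=states)

X = StandardScaler().fit_transform(df)

kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
ward_labels = fcluster(linkage(X, method="ward"), t=4, criterion="maxclust")

print(pd.DataFrame({"kmeans": kmeans_labels, "ward": ward_labels},
                   index=df.index).sort_values("kmeans"))
```

Cluster solutions can then be validated internally (e.g., with silhouette-type measures) and externally against estimated incidence, as the article describes.
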
  •
    ABSTRACT: The present program of research synthesizes the findings from three studies in line with two goals. First, the present research explores how the oil and gas industry is performing at risk mitigation in terms of finding and fixing errors when they occur. Second, the present research explores what factors in the work environment relate to a risk-accommodating environment. Study 1 presents a descriptive evaluation of high-consequence incidents at 34 oil and gas companies over a 12-month period (N = 873), especially in terms of those companies' effectiveness at investigating and fixing errors. The analysis found that most investigations were fair in terms of quality (mean = 75.50%), with a smaller proportion that were weak (mean = 11.40%) or strong (mean = 13.24%). Furthermore, most companies took at least one corrective action for high-consequence incidents, but few of these corrective actions were confirmed as having been completed (mean = 13.77%). In fact, most corrective actions were secondary interim administrative controls (e.g., having a safety meeting) rather than fair or strong controls (e.g., training, engineering elimination). Study 2a found that several environmental factors explain 56.41% of the variance in safety, including management's disengagement from safety concerns, finding and fixing errors, safety management system effectiveness, training, employee safety, procedures, and a production-over-safety culture. Qualitative results from Study 2b suggest that a compliance-based culture of adhering to liability concerns, out-group blame, and a production-over-safety orientation may all impede safety effectiveness.
    No preview · Article · Feb 2016 · Risk Analysis
  •
    ABSTRACT: Measures of sensitivity and uncertainty have become an integral part of risk analysis. Many such measures have a conditional probabilistic structure, for which a straightforward Monte Carlo estimation procedure has a double-loop form. Recently, a more efficient single-loop procedure has been introduced, and consistency of this procedure has been demonstrated separately for particular measures, such as those based on variance, density, and information value. In this work, we give a unified proof of single-loop consistency that applies to any measure satisfying a common rationale. This proof is not only more general but invokes less restrictive assumptions than heretofore in the literature, allowing for the presence of correlations among model inputs and of categorical variables. We examine numerical convergence of such an estimator under a variety of sensitivity measures. We also examine its application to a published medical case study.
    No preview · Article · Feb 2016 · Risk Analysis
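
The contrast between the double-loop and single-loop estimators can be made concrete with a variance-based first-order index for a toy model, as sketched below; the model, sample sizes, and binning scheme are illustrative choices, not the article's general construction.

```python
# Double-loop vs. single-loop (given-data) estimation of a variance-based
# first-order sensitivity index for one input of a toy model.
import numpy as np

def model(x1, x2, x3):
    """Toy model whose first-order sensitivity to x1 we estimate."""
    return x1 + 2.0 * x2 ** 2 + 0.5 * x3

rng = np.random.default_rng(42)

# Double loop: condition on x1 explicitly (outer loop), re-sample the rest.
outer, inner = 200, 500
x1_vals = rng.normal(size=outer)
cond_means = np.array([model(v, rng.normal(size=inner), rng.normal(size=inner)).mean()
                       for v in x1_vals])
y_ref = model(*rng.normal(size=(3, 100_000)))
S1_double = cond_means.var() / y_ref.var()

# Single loop: one joint sample, then bin on x1 and average within bins.
x1, x2, x3 = rng.normal(size=(3, 100_000))
y = model(x1, x2, x3)
edges = np.quantile(x1, np.linspace(0, 1, 51))
bin_idx = np.clip(np.digitize(x1, edges) - 1, 0, 49)
bin_means = np.array([y[bin_idx == b].mean() for b in range(50)])
S1_single = bin_means.var() / y.var()

print(f"first-order index of x1: double-loop {S1_double:.3f}, single-loop {S1_single:.3f}")
```

The single-loop estimate reuses one sample for all inputs, which is what makes the approach attractive when the model is expensive to run.
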
  •
    ABSTRACT: We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so, it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease, a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative, yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts’ time in the elicitation. This article was written as part of a workshop, “Methods for Research Synthesis: A Cross-Disciplinary Approach,” held at the Harvard Center for Risk Analysis on October 13, 2013.
    No preview · Article · Feb 2016 · Risk Analysis
  •
    ABSTRACT: Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed.
    No preview · Article · Jan 2016 · Risk Analysis
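
The article's central comparison can be checked numerically with probability generating functions, as sketched below. The negative binomial is used as a convenient clustered (overdispersed) count distribution; the mean doses, dispersion, and per-organism infection probability are assumed values.

```python
# Numeric illustration of the comparison above: for equal mean dose, a
# clustered (overdispersed) dose distribution gives a lower single-hit risk
# than a Poisson dose distribution. Parameter values are illustrative.
import numpy as np

p = 0.1                            # per-organism probability of infection (assumed)
mu = np.array([0.5, 2.0, 10.0])    # mean doses (assumed)
k = 0.5                            # negative binomial dispersion; smaller k = more clustering

# Single-hit risk = 1 - G_N(1 - p), with G_N the pgf of the dose distribution.
risk_poisson = 1.0 - np.exp(-mu * p)              # Poisson pgf exp(-mu(1-s)) at s = 1-p
risk_negbin = 1.0 - (1.0 + mu * p / k) ** (-k)    # NB pgf (1 + mu(1-s)/k)^(-k) at s = 1-p

for m, rp, rn in zip(mu, risk_poisson, risk_negbin):
    print(f"mean dose {m:5.1f}: Poisson risk {rp:.4f}, clustered (NB) risk {rn:.4f}")
```
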
  •
    ABSTRACT: Dose-response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi-mechanistic models known as single-hit models, such as the exponential and the exact beta-Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single-hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so-called single-hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single-hit models. Further analysis of the model framework is facilitated by formulating the single-hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single-hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model-consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model-consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model-consistent expression for the mean per-exposure dose that produces the correct total risk from repeated exposures is developed.
    No preview · Article · Jan 2016 · Risk Analysis
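
Conclusion (1) can be checked numerically under the additional assumption of Poisson-distributed doses, as in the sketch below; the Beta susceptibility distribution and mean dose are illustrative values.

```python
# Numeric check of conclusion (1): with Poisson doses of mean mu, letting the
# single-hit probability vary across hosts (here Beta-distributed) gives a
# lower risk than a constant probability with the same mean, by Jensen's
# inequality applied to exp(-mu * p). Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
mu = 5.0                   # mean Poisson dose (assumed)
alpha, beta = 0.5, 4.5     # Beta parameters; mean susceptibility = alpha/(alpha+beta) = 0.1

p_var = rng.beta(alpha, beta, size=1_000_000)
risk_varying = 1.0 - np.mean(np.exp(-mu * p_var))    # E_p[1 - exp(-mu p)]
risk_constant = 1.0 - np.exp(-mu * p_var.mean())     # same mean susceptibility, no variation

print(f"risk with varying susceptibility:  {risk_varying:.4f}")
print(f"risk with constant susceptibility: {risk_constant:.4f}")
```
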