Ronald L. Iman

  • PhD
  • Southwest Technology Consultants

About

118 Publications
60,218 Reads
15,087 Citations
Current institution: Southwest Technology Consultants

Publications (118)
Article
Full-text available
The procedure of statistical discrimination is simple in theory but not so simple in practice. An observation x0, possibly multivariate, is to be classified into one of several populations π1, ..., πk, which have, respectively, the density functions f1(x), ..., fk(x). The decision procedure is to evaluate each densi...
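
A minimal sketch of the decision rule this abstract describes: evaluate each population's density at the observation and classify into the population with the largest prior-weighted density. The normal populations and equal priors below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: Bayes discrimination between two hypothetical populations.
from scipy import stats

# Assumed populations pi_1, pi_2 with known densities f_1, f_2 (illustrative).
populations = {
    "pi_1": stats.norm(loc=0.0, scale=1.0),
    "pi_2": stats.norm(loc=2.0, scale=1.5),
}
priors = {"pi_1": 0.5, "pi_2": 0.5}  # assumed equal prior probabilities

def classify(x0):
    """Assign x0 to the population maximizing prior * density."""
    scores = {name: priors[name] * dist.pdf(x0)
              for name, dist in populations.items()}
    return max(scores, key=scores.get)

print(classify(0.4))  # -> 'pi_1'
print(classify(1.8))  # -> 'pi_2'
```
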
Article
The analysis of probabilistic fault trees often involves the investigation of events that contribute both to the frequency of the top event and to the uncertainty in this frequency. This paper provides a discussion of three measures of the contribution of an event to the total uncertainty in the top event. These measures are known as uncertainty im...
Article
Full-text available
Many different techniques have been proposed for performing uncertainty and sensitivity analyses on computer models for complex processes. The objective of the present study is to investigate the applicability of three widely used techniques to three computer models having large uncertainties and varying degrees of complexity in order to highlight...
Article
The individual plant analyses in the U.S. Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident-progression analysis, source-term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both in...
Article
Bayesian methods can be very useful in modeling applications used in risk assessments. For example, a Bayesian analysis can be used to provide a probabilistic comparison of different probability models relative to a set of data, as well as to provide uncertainty bounds on the predictions from the various models. For more complex models or composite...
Article
Full-text available
Probabilistic risk assessments (PRAs) of nuclear power plants proceed by modeling potential accident sequences at the plant of interest. These hypothesized accident sequences begin with initiating events. A very important initiating event phenomenon is the loss of off-site power (LOSP). This is the interruption of the preferred power supply to the...
Article
There is a need for plant-specific distributions of incidence and failure rates rather than distributions from pooled data which are based on the “common incidence rate” assumption. The so-called superpopulation model satisfies this need through a practically appealing approach that accounts for the variability over the population of plants. Unfort...
Article
Full-text available
Virtually every aspect of hurricane planning and forecasting involves (or should involve!) the science of statistics. The very active 2004 and 2005 Atlantic hurricane seasons, in particular the devastating landfall of Hurricane Katrina on the Gulf Coast, as well as concerns that climate change is altering hurricane frequency and intensity, provide ma...
Article
Full-text available
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates...
Article
Full-text available
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates...
Article
Full-text available
The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) performs an annual review of computer models that have been submitted by vendors for use in insurance rate filings in Florida. As part of the review process and to comply with the Sunshine Law, the FCHLPM employs a Professional Team to perform on-site (confidential) audits of...
Article
Full-text available
The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) performs an annual review of computer models that have been submitted by vendors for use in insurance rate filings in Florida. As part of the review process and to comply with the Sunshine Law, the FCHLPM employs a Professional Team to perform on-site (confidential) audits of t...
Article
Full-text available
This chapter discusses the use of computer models for such diverse applications as safety assessments for geologic isolation of radioactive waste and for nuclear power plants; loss cost projections for hurricanes; reliability analyses for manufacturing equipment; transmission of HIV; and subsurface storm flow modelling. Such models are usually char...
Article
The Circuit Card Assembly and Materials Task Force (CCAMTF) is producing proof-of-performance data to educate the population and to facilitate change. The CCAMTF developed a three-phase approach for its program. A screening phase was used to downselect alternative surface finishes and to provide guidance on environmental conditions for evaluating t...
Article
This paper reviews the vital role played by cleaning in the manufacture of electronic assemblies and shows why assembly-level cleaning will be alive and well into the foreseeable future. Post-assembly cleaning makes parts acquisition, handling, and soldering a relatively non-demanding and forgiving process. It takes a considerable amount of hassle...
Article
The LRSTF combined the efforts of industry, military, and government to evaluate low-residue soldering processes for military and commercial applications. These processes were selected for evaluation because they provide a means for the military to support the presidential mandate while producing reliable hardware at a lower cost. This report prese...
Conference Paper
Environmental regulations are encouraging the development of new environmentally conscious manufacturing (ECM) processes. However, the quality and reliability of these processes and hardware produced must be understood prior to implementing these new technologies in factories. Furthermore, military hardware fabrication is governed by standards and...
Article
Full-text available
This paper is a presentation made in support of the statistics profession. The field can claim a major impact in most major fields of study presently undertaken by man, yet it is not perceived as an important or critical field of study. Nor is it a growth field; witness the almost level number of faculty and new PhDs produced ove...
Article
Chemical fluxes are typically used during conventional electronic soldering to enhance solder wettability. Most fluxes contain very reactive, hazardous constituents that require special storage and handling. Corrosive flux residues that remain on soldered parts can severely degrade product reliability. The residues are removed with chlorofluorocarb...
Article
Full-text available
Computer models are utilized in such diverse fields as economic forecasting, transmission of HIV, engineering applications, factory production, weather forecasting, system reliability, mechanical design, and risk assessment. The models used in these applications frequently have many uncertain inputs and produce many different outputs. In...
Article
Full-text available
The performance of a probabilistic risk assessment (PRA) for a nuclear power plant is a complex undertaking, involving the assembly of an accident frequency analysis, an accident progression analysis, a source term analysis, and a consequence analysis. Each of these analyses is, in itself, quite complex. Uncertainties enter into a PRA from each of...
Article
The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both info...
Article
This document has been designed for users of the Probabilistic Risk Assessment Model Integration System (PRAMIS) computer program developed by the authors at Sandia National Laboratories for easy assembly of the individual parts of the NUREG-1150 plant analyses into overall risk results. PRAMIS assembles the following files associated with the NUREG-1150 a...
Chapter
Probabilistic Risk Assessment (PRA) plays an important role in the nuclear reactor regulatory process, and the assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. However, uncertainty analysis and sensitivity analysis in the context of PRA are relatively immature fields. A revie...
Article
Full-text available
Risk analysis of nuclear power generation often requires the use of expert opinion to provide probabilistic inputs where other sources of information are unavailable or are not cost effective. In the Reactor Risk Reference Document (NUREG-1150), a methodology for the collection of expert opinion was developed. Earlier criticisms pointed out the nee...
Article
Full-text available
Rank tests provide an alternative to the usual normal theory F-test for the analysis of data from randomized complete blocks experiments. Two such rank tests are the Friedman test which employs the method of n-rankings and the rank transformation procedure which employs an overall ranking of the data. In this paper the asymptotic efficiency of the...
Article
Full-text available
Industry data representing the time to recovery of loss of off-site power at nuclear power plants for 63 incidents caused by plant-centered losses, grid losses, or severe weather losses are fit with exponential, lognormal, gamma and Weibull probability models. A Bayesian analysis is used to compare the adequacy of each of these models and to provid...
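
As a rough stand-in for the paper's Bayesian comparison, the four candidate models named above can be fit by maximum likelihood and compared by AIC with scipy. The recovery times below are simulated placeholders, not the 63 industry incidents analyzed in the paper.

```python
# Hedged sketch: ML fits of four candidate models, compared by AIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
recovery_hours = rng.gamma(shape=1.2, scale=0.8, size=63)  # placeholder data

candidates = {
    "exponential": stats.expon,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(recovery_hours, floc=0)   # fix location at zero
    loglik = np.sum(dist.logpdf(recovery_hours, *params))
    k = len(params) - 1                         # location was fixed
    aic = 2 * k - 2 * loglik
    print(f"{name:12s} AIC = {aic:.1f}")
```
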
Article
This report contains risk assessment methodology developed for use in assessing the risk from the disposal of radioactive wastes in deep geologic formations. This methodology consists of techniques for selecting and screening scenarios, models for use in simulating the physical processes and estimating the consequences associated with the occurrenc...
Article
Full-text available
Many situations exist in which n objects are ranked by two or more independent sources, where interest centers primarily on agreement in the top rankings and disagreements on items at the bottom of the rankings are of little or no importance. A problem with Spearman's rho or Kendall's coefficient of concordance in this setting is that they are equa...
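
A sketch of the top-down coefficient this line of work proposes, as I understand the Iman-Conover construction: replace each ranking with Savage scores, which weight agreement at the top of a ranking heavily, then correlate the scores. The example rankings are made up.

```python
# Hedged sketch: top-down correlation via Savage scores.
import numpy as np

def savage_scores(ranks):
    """Savage score for rank i (1 = top) is sum_{j=i}^{n} 1/j."""
    n = len(ranks)
    tail_sums = np.cumsum(1.0 / np.arange(n, 0, -1))[::-1]  # S_1 .. S_n
    return tail_sums[np.asarray(ranks) - 1]

def top_down_correlation(ranks_a, ranks_b):
    sa, sb = savage_scores(ranks_a), savage_scores(ranks_b)
    return np.corrcoef(sa, sb)[0, 1]

# Two judges agree on the top items but disagree at the bottom.
judge1 = [1, 2, 3, 4, 5, 6]
judge2 = [1, 2, 3, 6, 5, 4]
print(top_down_correlation(judge1, judge2))  # close to 1
```
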
Article
The analysis of technological risks must often proceed with sparse data concerning the systems being evaluated. To expand the empirical basis from which conclusions are reached, data concerning similar systems, or similar components within other systems, are used to estimate the probabilities of various safeguard failures. New systems and component...
Article
System unavailabilities for large complex systems such as nuclear power plants are often evaluated through use of fault tree analysis. The system unavailability is obtained from a Boolean representation of a system fault tree. Even after truncation of higher order terms these expressions can be quite large, involving thousands of terms. A general m...
Article
Full-text available
Iman and Conover (1985, 1987) have suggested the top-down correlation coefficient as a measure of association when n objects are ranked by two or more independent sources and interest centers primarily on agreement in the top rankings, with disagreements on items at the bottom of the rankings being of little or no importance. The top-down correlati...
Article
Full-text available
The MAEROS aerosol model is being incorporated into the MELCOR code system for the calculation of risk from severe reactor accidents. To gain insight to assist in this incorporation, a computational test problem involving a three-component aerosol in the upper plenum of a pressurized water reactor was analyzed with MAEROS. The following topics were...
Conference Paper
The calculation of risk and the propagation of uncertainties in the US Nuclear Regulatory Commission's reassessment of risk from commercial nuclear power stations (i.e., NUREG-1150) is described. The overall integration of the analysis performed for each nuclear power station considered in NUREG-1150 is based on: (1) relatively fast-running models...
Article
This document has been designed for users of the computer program, Top Event Matrix Analysis Code (TEMAC), developed by the authors at Sandia National Laboratories for estimating risk, and performing uncertainty and sensitivity analyses with a Boolean expression such as produced by the Set Equation Transformation System (SETS) computer program (Wor...
Article
Full-text available
A computational test problem for the MAEROS aerosol model is used to illustrate the application of uncertainty/sensitivity analysis techniques based on Latin hypercube sampling and regression analysis to aggregation problems. The test problem involves a five-component aerosol in the containment of a pressurized water reactor. The following topics a...
Article
Full-text available
An uncertainty and sensitivity analysis of the MAEROS model for multicomponent aerosol dynamics is presented. Analysis techniques based on Latin hypercube sampling and regression analysis are used to study the behavior of a two-component aerosol in a nuclear power plant containment for a transient accident with loss of alternating current power (i....
Article
A comparison of two methodologies for the analysis of uncertainty in risk analyses is presented. One methodology combines approximate methods for confidence interval estimation of system reliability with a bounding approach for information derived from expert opinion. The other method employs Bayesian arguments to construct probability distribution...
Article
Computer models for various applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model, and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that recei...
Article
The asymptotic behavior of a linear compartment model for the environmental movement of radionuclides is investigated. Here, the expression asymptotic behavior is used to designate the behavior of q(t) as t → ∞, where q is the solution of a vector differential equation of the form dq/dt = h + Kq. The asymptotic behavior of such equations is describ...
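
When all eigenvalues of K have negative real parts, the equation dq/dt = h + Kq settles to the steady state q_inf = -K^{-1} h, which is the asymptotic behavior the abstract refers to. A minimal numerical check, with an assumed two-compartment K and source term h (illustrative values, not from the paper):

```python
# Hedged sketch: steady state of dq/dt = h + K q for a stable K.
import numpy as np
from scipy.integrate import solve_ivp

K = np.array([[-0.5, 0.1],
              [0.2, -0.8]])   # assumed stable transfer-rate matrix
h = np.array([1.0, 0.3])      # assumed constant source term

q_inf = np.linalg.solve(K, -h)            # steady state: h + K q = 0

sol = solve_ivp(lambda t, q: h + K @ q, (0, 100), y0=[0.0, 0.0])
print(q_inf, sol.y[:, -1])                # numerical solution approaches q_inf
```
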
Article
Full-text available
This document is for users of a computer program developed by the authors at Sandia National Laboratories. The computer program is designed to be used in conjunction with sensitivity analyses of complex computer models. In particular, this program is most useful in analyzing input-output relationships when the input has been selected using the Lati...
Conference Paper
Full-text available
MARCH 2 calculations for the TMLB' accident (a transient with loss of all ac power) at Surry provide a test bed for addressing problems with computer models in an analysis situation typical of those expected in applications of the MELCOR code system for severe accident analysis. This paper describes a sensitivity analysis for the TMLB' accident seq...
Conference Paper
Full-text available
The techniques for performing uncertainty/sensitivity analyses compiled as part of the MELCOR program appear to be well suited for use with a health and economic consequence model. Two replicate samples of size 50 gave essentially identical results, indicating that for this case, a Latin hypercube sample of size 50 seems adequate to represent the d...
Article
Full-text available
The Friedman test (or sign test when k = 2) depends entirely on within-block rankings. In a recent paper, Quade (1979) attempted to provide a test with more power than the Friedman test by considering a k-sample extension of the Wilcoxon signed ranks test. This is done by taking advantage of the between-block information. A third way to approach th...
Article
Full-text available
This document has been designed for users of the computer program developed by the authors at Sandia National Laboratories for the generation of either Latin hypercube or random multivariate samples. The Latin hypercube technique employs a constrained sampling scheme, whereas random sampling corresponds to a simple Monte Carlo technique. The genera...
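
A minimal sketch of the Latin hypercube idea the program implements: stratify (0, 1) into n equal-probability intervals for each variable, sample once within each interval, shuffle the strata independently across variables, and push the results through the inverse marginal CDFs. The marginals below are illustrative choices, not anything specified by the program.

```python
# Hedged sketch: Latin hypercube sampling on (0, 1) plus marginal transforms.
import numpy as np
from scipy import stats

def latin_hypercube(n, k, rng):
    """Return an n-by-k matrix of LHS points on (0, 1)."""
    u = (rng.random((n, k)) + np.arange(n)[:, None]) / n  # one point per stratum
    for j in range(k):                                    # decouple the columns
        rng.shuffle(u[:, j])
    return u

rng = np.random.default_rng(42)
u = latin_hypercube(n=10, k=2, rng=rng)
x1 = stats.norm.ppf(u[:, 0], loc=0, scale=1)      # normal marginal
x2 = stats.uniform.ppf(u[:, 1], loc=5, scale=10)  # uniform on [5, 15]
```
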
Conference Paper
In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. The goal of uncertainty analysis is to measure the imprecision in PRA outc...
Article
Full-text available
The rank transformation refers to the replacement of data by their ranks, with a subsequent analysis using the usual normal theory procedure, but calculated on the ranks rather than on the data. Rank transformation procedures have previously been shown by the authors to have properties of robustness and power in both regression and analysis of vari...
Article
Results are presented from a sensitivity analysis study of a model developed to represent the environmental movement of radionuclides. This model is designated the Environmental Transport Model. The study has three purposes: (1) to develop sensitivity analysis techniques applicable to the Environmental Transport Model, (2) to provide insight and ex...
Article
Full-text available
This report contains an explanation of an algorithm that, when executed, will operate on any symmetric approximate correlation matrix by iteratively adjusting the eigenvalues of this matrix. The objective of this algorithm is to produce a valid, positive definite, correlation matrix. Also included is a description of a program (called POSDEF) which implements...
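
One plausible reading of the eigenvalue-adjustment idea, sketched below as a general technique rather than as the POSDEF program itself: clip the negative eigenvalues of the approximate correlation matrix to a small positive value, reconstruct, and rescale the diagonal back to ones, repeating until the matrix is positive definite.

```python
# Hedged sketch: iterative eigenvalue adjustment toward a valid correlation matrix.
import numpy as np

def adjust_to_positive_definite(c, eps=1e-6, max_iter=100):
    c = np.asarray(c, dtype=float)
    for _ in range(max_iter):
        vals, vecs = np.linalg.eigh(c)
        if vals.min() > 0:
            return c
        vals = np.clip(vals, eps, None)     # lift negative eigenvalues
        c = vecs @ np.diag(vals) @ vecs.T
        d = np.sqrt(np.diag(c))
        c = c / np.outer(d, d)              # restore unit diagonal
    return c

bad = np.array([[1.0, 0.9, 0.7],
                [0.9, 1.0, -0.9],
                [0.7, -0.9, 1.0]])          # inconsistent, not positive definite
print(np.linalg.eigvalsh(adjust_to_positive_definite(bad)))
```
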
Article
Graphs are presented on which the empirical distribution function can be plotted to test the assumption of normality by the Lilliefors test. A second set of graphs is presented for using the Lilliefors test on exponential distributions. The graphs allow for tests at the 10 percent, 5 percent, and 1 percent levels of significance. Use of these graph...
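
For reference, the statistic behind these graphs is the Kolmogorov-Smirnov distance computed against a normal CDF whose parameters are estimated from the same data; the critical values must then come from Lilliefors tables (or Monte Carlo), not the ordinary KS tables. A minimal sketch of the normal-case statistic:

```python
# Hedged sketch: the Lilliefors D statistic for normality.
import numpy as np
from scipy import stats

def lilliefors_statistic(x):
    x = np.sort(np.asarray(x, dtype=float))
    z = (x - x.mean()) / x.std(ddof=1)      # parameters estimated from the data
    cdf = stats.norm.cdf(z)
    n = len(x)
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(5)
print(lilliefors_statistic(rng.normal(size=50)))       # small for normal data
print(lilliefors_statistic(rng.exponential(size=50)))  # larger for skewed data
```
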
Article
Full-text available
A method for inducing a desired rank correlation matrix on multivariate input vectors for simulation studies has recently been developed by Iman and Conover (1982). The primary intention of this procedure is to produce correlated input variables for use with computer models. Since this procedure is distribution free and allows the exact marginal di...
Article
Full-text available
A method for inducing a desired rank correlation matrix on a multivariate input random variable for use in a simulation study is introduced in this paper. This method is simple to use, is distribution free, preserves the exact form of the marginal distributions on the input variables, and may be used with any type of sampling scheme for which corre...
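
A compact sketch of this restricted-pairing technique as I understand it: build a score matrix (van der Waerden scores here, my assumption), impose the target correlation on the scores with Cholesky factors, then reorder each column of the actual sample to share the ranks of the transformed scores, leaving the marginals untouched.

```python
# Hedged sketch of a rank-correlation-induction procedure in the Iman-Conover style.
import numpy as np
from scipy import stats

def induce_rank_correlation(x, target_corr, rng):
    """Reorder columns of x so the rank correlation approximates target_corr."""
    n, k = x.shape
    scores = stats.norm.ppf(np.arange(1, n + 1) / (n + 1))  # van der Waerden scores
    s = np.column_stack([rng.permutation(scores) for _ in range(k)])
    p = np.linalg.cholesky(target_corr)
    q = np.linalg.cholesky(np.corrcoef(s, rowvar=False))
    t = s @ np.linalg.inv(q).T @ p.T        # scores with correlation ~ target
    out = np.empty_like(x)
    for j in range(k):                      # rank-preserving reordering of x
        ranks = stats.rankdata(t[:, j]).astype(int) - 1
        out[:, j] = np.sort(x[:, j])[ranks]
    return out

rng = np.random.default_rng(1)
x = np.column_stack([rng.gamma(2.0, size=200), rng.uniform(size=200)])
c = np.array([[1.0, 0.7], [0.7, 1.0]])
y = induce_rank_correlation(x, c, rng)
print(stats.spearmanr(y[:, 0], y[:, 1])[0])  # ~0.7, marginals unchanged
```
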
Article
Full-text available
Many different projects at Sandia National Laboratories use mathematical models to represent different physical processes which are then implemented on a computer. One such project is the development of a methodology to assess the risk associated with the geologic isolation of radioactive waste. Models are used for physical processes such as subsur...
Article
Full-text available
A computer model for a physical process produces a time dependent output Z based on a particular selection of input values X1, ..., Xk. For the case considered herein it is required to make a number of computer runs n and use these results to estimate the distribution function for the output Z and to quantify the uncertainty as reflecte...
Article
Full-text available
This is the second part of a two-part paper presenting a statistical approach to the sensitivity analysis of computer models. In this part consideration is given to response surface construction techniques including identification of possible overfit and data transformation to handle nonlinearities. Methods for response surface validation are prese...
Article
Full-text available
Many of the more useful and powerful nonparametric procedures may be presented in a unified manner by treating them as rank transformation procedures. Rank transformation procedures are ones in which the usual parametric procedure is applied to the ranks of the data instead of to the data themselves. This technique should be viewed as a useful tool...
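
A minimal illustration of the rank transformation idea: pool the data, replace observations by their ranks, and apply the usual parametric procedure to the ranks. On two samples, the t-test on ranks behaves much like the Wilcoxon rank sum test. The simulated data are illustrative.

```python
# Hedged sketch: the rank transformation applied to a two-sample comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
a = rng.normal(0.0, 1.0, 25)
b = rng.normal(0.8, 1.0, 25)

ranks = stats.rankdata(np.concatenate([a, b]))   # pool, then rank
ra, rb = ranks[:25], ranks[25:]
print(stats.ttest_ind(ra, rb))                   # usual t-test, on the ranks
print(stats.mannwhitneyu(a, b))                  # closely related rank test
```
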
Article
Full-text available
This is the first part of a two-part article presenting a statistical approach to the sensitivity analysis of computer models. Part I defines the objectives of sensitivity analysis and presents a computer model that is used for purposes of illustration in the paper. A discussion of the necessary considerations for selection of sample values of vari...
Conference Paper
The Environmental Transport Model is used to represent the surface movement of radionuclides. This presentation provides an indication of the results obtained in a study of the asymptotic behavior of this model. The nature of the Environmental Transport Model is indicated and its asymptotic behavior is discussed. Then, an approach to sensitivity an...
Article
Full-text available
A method for inducing a desired rank correlation matrix on a multivariate input random variable is introduced in this paper. This method is simple to use, is distribution free, preserves the exact form of the marginal distributions on the input variables, and may be used with any type of sampling scheme for which correlation of input variables is a...
Article
Full-text available
As modeling efforts expand to a broader spectrum of areas the amount of computer time required to exercise the corresponding computer codes has become quite costly. This costly process can be directly tied to the complexity of the modeling, which makes the relationships among the input variables not mathematically tractable. In this setting it is d...
Book
Full-text available
This document is designed for users of the program developed at Sandia Laboratories by the authors to generate Latin hypercube samples. Latin hypercube sampling is a recently developed sampling technique for generating input vectors into computer models for purposes of sensitivity analysis studies. In addition to providing a cost-effective and reli...
Article
Full-text available
The Friedman test for the randomized complete block design is used to test the hypothesis of no treatment effect among k treatments with b blocks. Difficulty in determination of the size of the critical region for this hypothesis is compounded by the facts that the most recent extension of exact tables for the distribution of the test statistic by...
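
For n and k beyond the exact tables, scipy provides the Friedman statistic with its chi-square approximation; a quick check on a small made-up randomized complete block design (b = 4 blocks, k = 3 treatments):

```python
# Hedged sketch: Friedman test via scipy's chi-square approximation.
from scipy import stats

# Each list is one treatment's response in blocks 1..4 (illustrative data).
t1 = [9.0, 9.5, 5.0, 7.5]
t2 = [7.0, 6.5, 7.0, 6.0]
t3 = [6.0, 8.0, 4.0, 6.0]
stat, p = stats.friedmanchisquare(t1, t2, t3)
print(stat, p)  # p from the chi-square approximation, not exact tables
```
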
Article
Full-text available
As modeling efforts expand to a broader spectrum of areas the amount of computer time required to exercise the corresponding computer codes has become quite costly (several hours for a single run is not uncommon). This costly process can be directly tied to the complexity of the modeling and to the large number of input variables (often numbering i...
Article
A model for the surface movement of radionuclides is presented. This model, which is referred to as the Pathways Model, was constructed in an NRC project to develop a methodology to assess the risk associated with the geologic disposal of high-level radioactive waste. The methodology development involves work in two major areas: (a) models for physi...
Article
Full-text available
The Friedman (1937) test for the randomized complete block design is used to test the hypothesis of no treatment effect among k treatments with b blocks. Difficulty in determination of the size of the critical region for this hypothesis is compounded by the facts that (1) the most recent extension of exact tables for the distribution of the test s...
Article
Full-text available
A method for inducing a desired rank correlation matrix on multivariate input vectors for simulation studies has recently been developed by Iman and Conover (SAND 80-0157). The primary intention of this procedure is to produce correlated input variables for use with computer models. Since this procedure is distribution free and allows the exact mar...
Conference Paper
Full-text available
Statistical methodology has been applied to the investigation of the regional hydrologic systems of a large area encompassing the Nevada Test Site (NTS) as a part of the overall evaluation of the NTS for deep geologic disposal of nuclear waste. Statistical techniques including Latin hypercube sampling were used to perform a sensitivity analysis on...
Article
Full-text available
The rank transform is a simple procedure which involves replacing the data with their corresponding ranks. The rank transform has previously been shown by the authors to be useful in hypothesis testing with respect to experimental designs. This study shows the results of using the rank transform in regression. Two sets of data given by Daniel and W...
Article
When a two-dimensional map contains points that appear to be scattered somewhat at random, a question that often arises is whether groups of points that appear to cluster are merely exhibiting ordinary behavior that one can expect with any random distribution of points, or whether the clusters are too pronounced to be attributable to chance alone....
Conference Paper
Full-text available
There are large uncertainties in the inputs used to model a nuclear waste depository. The aim of the sensitivity studies was to identify these uncertainties and to reduce the number of SWIFT transport calculations. The sensitivity analysis was applied to a hypothetical problem consisting of both a reference site containing a nuclear waste depository and a disruptive-event...
Article
Statistical techniques for sensitivity analysis of a complex model are presented. Included in these techniques are Latin hypercube sampling, partial rank correlation, rank regression, and predicted error sum of squares. The synthesis of these techniques was motivated by the need to analyze a model for the surface movement of radionuclides. The mode...
Article
Full-text available
Exact tables for Spearman’s rho are available only for n ≤ 16 and no ties in the data. Some accurate methods of approximating the distribution with no ties present have been used to obtain approximate tables for larger values of n. Often ties are present in the data so these tables are no longer exact. Also sometimes the tables are not conveniently...
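
In modern software the tied case is handled with midranks and an approximate p-value; scipy's spearmanr does exactly this (illustrative data containing ties):

```python
# Hedged sketch: Spearman's rho with ties, via midranks and an approximation.
from scipy import stats

x = [1, 2, 2, 4, 5, 6, 7, 8]   # note the tie at 2
y = [2, 1, 4, 3, 6, 5, 8, 8]   # and the tie at 8
rho, p = stats.spearmanr(x, y)
print(rho, p)
```
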
Article
Full-text available
Since the squared ranks test was first proposed by Taha in 1964 it has been mentioned by several authors as a test that is easy to use, with good power in many situations. It is almost as easy to use as the Wilcoxon rank sum test, and has greater power when two populations differ in their scale parameters rather than in their location parameters. T...
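
A sketch of the squared ranks test in the form I know from Conover: rank the absolute deviations from each sample mean, square the ranks, and compare the first sample's sum of squared ranks to its large-sample normal approximation. The moment formulas below assume no ties and are quoted from memory, so treat them as assumptions rather than a transcription of the paper.

```python
# Hedged sketch: squared ranks test for a difference in scale.
import numpy as np
from scipy import stats

def squared_ranks_test(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    u = np.abs(x - x.mean())                 # deviations within sample 1
    v = np.abs(y - y.mean())                 # deviations within sample 2
    ranks = stats.rankdata(np.concatenate([u, v]))
    n, m = len(x), len(y)
    N = n + m
    t = np.sum(ranks[:n] ** 2)               # sum of squared ranks, sample 1
    mean_t = n * (N + 1) * (2 * N + 1) / 6   # no-ties moments (assumed)
    var_t = m * n * (N + 1) * (2 * N + 1) * (8 * N + 11) / 180
    z = (t - mean_t) / np.sqrt(var_t)
    return z, 2 * stats.norm.sf(abs(z))      # two-sided p-value

rng = np.random.default_rng(3)
z, p = squared_ranks_test(rng.normal(0, 1, 30), rng.normal(0, 3, 30))
print(z, p)   # large |z|: the scales differ
```
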
Article
Graphs are presented to provide quick and accurate 5% and 1% tests of the equality of two correlation coefficients based on equal-size samples. For unequal samples, the harmonic mean 2n1n2/(n1 + n2) provides a good compromise sample size for using the graphs.
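
The same test can be done numerically with the Fisher z-transformation: atanh(r) is approximately normal with variance 1/(n - 3), so two correlations are compared with a z-statistic. The harmonic mean 2n1n2/(n1 + n2) mentioned in the abstract is only needed when reading the equal-sample graphs; the direct test handles unequal samples as shown.

```python
# Hedged sketch: comparing two correlation coefficients via Fisher's z.
import numpy as np
from scipy import stats

def compare_correlations(r1, n1, r2, n2):
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    z = (z1 - z2) / se
    return z, 2 * stats.norm.sf(abs(z))      # two-sided p-value

print(compare_correlations(0.80, 50, 0.55, 40))  # z ~ 2.2, p ~ 0.03
```
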
Article
The power of some rank tests, used for testing the hypothesis of shift, is found when the underlying distributions contain outliers. The outliers are assumed to occur as the result of mixing two normal distributions with common variance. A small sample case shows how the scores for the rank tests are found and the exact power is computed for each o...
Conference Paper
There are several rank tests for analyzing certain experimental designs. This study compares the robustness and power of several of these, including the rank transform, aligned ranks, and joint rank sum procedures. The experimental designs studied include the two-way layout with n = 2 and n = 5 observations per cell, the two-way layout with unequal...
Article
An approximation to the exact distribution of the Wilcoxon rank sum test (Mann-Whitney U-test) and the Siegel-Tukey test based on a linear combination of the two-sample t-test applied to ranks and the normal approximation is compared with the usual normal approximation. The normal approximation results in a conservative test in the tails while the...
Article
Full-text available
Several approximations to the exact distribution of the Kruskal-Wallis test statistic presently exist. These approximations can roughly be grouped into two classes: (i) computationally difficult with good accuracy, and (ii) easy to compute but not as accurate as the first class. The purpose of this paper is to introduce two new approximations (one...
Article
Full-text available
The analysis of data from experimental designs is often hampered by the lack of more than one procedure available for the analysis, especially when that procedure is based on assumptions which do not apply in the situation at hand. In this paper two classes of alternative procedures are discussed and compared. One is the aligned ranks procedure wh...
Article
Full-text available
Weekly weighings of the laboratory rats are required to determine the correct dosage for mixing in the food. This creates problems in that the food mixing must be done immediately after the weighings and staff are often heavily taxed to perform the task. A discounted least squares growth prediction model allows for prediction of weights a week ahea...
