Probabilistic Risk Analysis - Science topic

Questions related to Probabilistic Risk Analysis
  • asked a question related to Probabilistic Risk Analysis
Question
2 answers
Can the DSHA spectrum be less than the PSHAs spectra?
I came across a report with the attached graph showing that the DSHA curves for each source considered in the PSHA calculation fall below the spectral values of the PSHA curves for the 1,000- and 2,500-year return periods.
My understanding was that, since DSHA is deterministic, it represents the worst-case scenario, so the spectral acceleration obtained with this method should be greater than that from PSHA.
Relevant answer
Answer
Hi Arman,
I confirm the previous answer by Zhongze Xu / Steve. It also depends on the way the DSHA spectrum is computed, i.e., how the aleatory variability σ of the GMPE is taken into account (0σ, 1σ, ...).
Best regards
py
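To illustrate the point numerically, here is a minimal R sketch with a toy GMPE (all coefficients, the scenario and sigma are hypothetical, not from any published model): a median-based (0σ) DSHA value can easily sit below a long-return-period PSHA spectrum, while adding 1σ pushes it well above the median.
# Toy GMPE, purely illustrative coefficients: ln(SA in g) as a function of M and R
ln.SA <- function(M, R) -3.5 + 1.0*M - 1.8*log(R + 10)
sigma <- 0.6                              # assumed aleatory variability of the toy GMPE
M <- 7.0; R <- 15                         # assumed controlling scenario (Mw, km)
SA.0sigma <- exp(ln.SA(M, R))             # median (0 sigma) DSHA value
SA.1sigma <- exp(ln.SA(M, R) + sigma)     # 84th-percentile (1 sigma) DSHA value
c(SA.0sigma, SA.1sigma)                   # about 0.10 g vs 0.18 g in this toy example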
  • asked a question related to Probabilistic Risk Analysis
Question
1 answer
Are disincentives the arche? How? Why?
My answer: Disincentives are very probably the arche. How? Any entity is guided more by disincentives than by incentives. Why? Disincentives may be the arche because this assumes the least while following the most evidence. Two other candidate explanations for the arche are risks and/or vibrations.
Source for vibrations:
Source for risks:
Relevant answer
Answer
Inhibitors generally have a negative effect, but few people take advantage of them for a new start.
  • asked a question related to Probabilistic Risk Analysis
Question
4 answers
I want to use this attenuation relationship (suitable for central Italy) in a probabilistic analysis; therefore, it must consider the soil type as a random variable with a certain probability distribution.
Moreover, if there is any formula or methodology in which the soil type from the epicenter to the target point is taken into account when deriving an attenuation relationship, I would appreciate it if you could introduce it.
Many thanks in advance.
Relevant answer
Answer
Many thanks anyway, Gianluca Regina.
  • asked a question related to Probabilistic Risk Analysis
Question
6 answers
Could any expert try to examine our novel approach for multi-objective optimization?
The brand-new approach, entitled "Probability-based multi-objective optimization for material selection", was published by Springer and is available at https://link.springer.com/book/9789811933509,
DOI: 10.1007/978-981-19-3351-6.
Relevant answer
Answer
  • asked a question related to Probabilistic Risk Analysis
Question
3 answers
I have 6 variables with mean, standard deviation, CoV, min and max. Please find the attached Excel file.
Relevant answer
You may use the approach for regression analyses. That should work.
  • asked a question related to Probabilistic Risk Analysis
Question
3 answers
I have a study that found an association between exposure to tricyclic antidepressants and the risk of preeclampsia. The number of women who were exposed and had the outcome (i.e., preeclampsia) was small: of 210 exposed women, 10 developed late-onset preeclampsia. A generalized linear model with binomial distribution and log link was used to calculate the relative risk (SPSS software). The reviewer asked us to "report the model goodness of fit criteria (to ensure correct specification of the model)".
How should I reply to him? Our study is an exploratory study that suggests an association; we are not building a model or predicting anything. Besides, the number of exposed cases is too small to predict anything. Thank you so much.
Relevant answer
Answer
Thank you very much for the advice. I am a pharmacist and not very good at statistics, which is why I used SPSS. Could you please show me steps/guide materials to do it? Thank you :)
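Since SPSS steps are hard to convey in a forum post, here is a minimal sketch of the equivalent log-binomial model in R with simulated data (the unexposed comparison group of 2,000 women and both risks are assumed purely for illustration); the residual deviance, its degrees of freedom and the AIC are the goodness-of-fit figures a reviewer typically expects to see.
# Hypothetical cohort mimicking the study: 210 exposed women plus an assumed unexposed group
set.seed(1)
exposed <- c(rep(1, 210), rep(0, 2000))
risk <- ifelse(exposed == 1, 10/210, 0.02)         # assumed illustrative risks
outcome <- rbinom(length(exposed), 1, risk)
# Log-binomial GLM (log link); start values help convergence with this link
fit <- glm(outcome ~ exposed, family = binomial(link = "log"), start = c(log(0.02), 0))
exp(coef(fit)["exposed"])                          # relative risk
c(deviance = deviance(fit), df = df.residual(fit)) # residual deviance and its df
AIC(fit)                                           # information criterion to report alongside the RR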
  • asked a question related to Probabilistic Risk Analysis
Question
3 answers
Dear researchers
The provisions of ASCE 7-10 state that the New Next Generation Attenuation Relationships (NNGAR) have been used in the Probabilistic Seismic Hazard Analysis (PSHA) process to prepare the seismic hazard maps provided by the United States Geological Survey (USGS).
Now, I want to know what the new next generation attenuation relationships are and how they differ from other typical attenuation relationships such as Campbell, Douglas, Godrati, BJF, etc.
Relevant answer
Answer
Hello Majid,
Just a few words to answer you, although I was not part of the NGA / NGA2 projects. Essentially, from a methodological viewpoint, these NGA GMPEs are not different from the previous ones you quote. They may, however, be considered significant improvements for several reasons:
- Very careful gathering of a common set of high-quality strong-motion data from various parts of the world, covering as broad a range of magnitudes and distances as possible, together with the corresponding metadata regarding the source characteristics (rupture extension, fault orientation, moment magnitude, etc.), the propagation characteristics (different source-receiver distance metrics, variety of source-receiver configurations), and the site characteristics (the poorest though, in my opinion, with only VS30, most often inferred rather than measured - but with a clear indication of the origin of the VS30 value). Personally, I do not trust the Z1 / Z2.5 values that much (the depth at which the S-wave velocity exceeds 1 or 2.5 km/s).
- Derivation of a parallel set of numerical simulation results (with 1D codes for horizontally layered media) in order to constrain the non-linear part of the site response.
- On this common data set, several groups of authors made their own selection and developed their own a priori models for the source, path and site effects, some with very sophisticated models to (try to) account for peculiar effects (directivity, rupture mechanism, NL site response, plus what is ambiguously called the "basin effect", which in fact corresponds to deep-deposit effects, etc.).
- A very careful estimation of the aleatory variability and its systematic separation into two terms (within-event and between-event), with or without additional dependence on magnitude / distance / site conditions.
These NGA relationships were developed over several years with extensive discussions between many participants, and can thus be considered the best state of the art at the time they were developed.
Hoping to have brought useful answers, and hoping also that NGA/NGA2 authors will complement and correct me!
pyb
  • asked a question related to Probabilistic Risk Analysis
Question
8 answers
When calculating a budget or a risk reserve, a simulation or estimation is performed.
Sometimes a Monte Carlo simulation is used.
It seems that each administration and each company uses different confidence percentiles when summarising the simulations in order to take a final decision.
Commonly, the 70%, 75% or 80% percentiles are used. The American administration uses 60% for civil works projects...
My doubt is: is there any recommendation or usual approach for choosing a percentile?
Is there any standard or normalized confidence percentile to use?
I expected to find such an answer from AACE International or the International Cost Estimating and Analysis Association, but I did not.
Thank you for sharing your knowledge.
Relevant answer
Answer
I believe setting confidence levels is more of an art than an exact science. It really depends on how important it is that the outcome remains within the confidence interval, but there are no real quantitative standards.
If going over budget almost certainly means default, you may want to base risk reserves on a high percentile. If default is not imminent, as with governmental institutions, the percentile may be set lower.
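As a concrete illustration, here is a minimal Monte Carlo sketch in R (the cost items and their distributions are made up): the funding level is read off the simulated total-cost distribution at whatever percentile the organisation has chosen, and the risk reserve is the difference between that percentile and the base estimate.
set.seed(123)
n <- 1e4
# Two hypothetical cost items with lognormal uncertainty
item1 <- rlnorm(n, meanlog = log(100), sdlog = 0.10)
item2 <- rlnorm(n, meanlog = log(50),  sdlog = 0.30)
total <- item1 + item2
quantile(total, probs = c(0.60, 0.70, 0.75, 0.80))  # P60, P70, P75, P80 funding levels
quantile(total, 0.80) - median(total)               # risk reserve if P80 is chosen over the median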
  • asked a question related to Probabilistic Risk Analysis
Question
6 answers
I am interested in doing a quantitative risk assessment for a building, but the historical data on past fire occurrences for that building is not available. What value should I take? I am unable to get it from the literature either. Please suggest.
Relevant answer
Answer
If the building doesn't have historical data on past fire scenarios, you should consider statistics for the same kind of building. FRA analysis from SFPE could help a lot.
  • asked a question related to Probabilistic Risk Analysis
Question
4 answers
Hello,
I am not a math major, and I need to prove mathematically, along with some results, how the probability of detecting errors improves with the scanning frequency. Can anyone please help or share some literature about this?
For better understanding:
I want to do something similar to how scanning a bar code works (the probability of reading a bar code on the first attempt, and how it gets better with frequency). A scanner often misses the bar code the first time but will eventually read it when you increase the frequency of the scanning effort (moving the same bar code through the scanner again and again). So I want to show, in the context of this example only, the probability of a true positive and a true negative on the first attempt, and how the probability improves with more scanning efforts on the same product.
Thank you!!!
Relevant answer
Answer
It depends on what you are scanning. If you are attempting to use the central limit theorem, then you must verify that the underlying hypotheses are indeed true if you want to prove something. Otherwise, you have to make assumptions, and if there are too many, the proof is not very reliable.
There are other issues as well, depending on what you are trying to measure. If it has some underlying geometric structure, then you may have to prove a uniqueness-of-approximation theorem. Otherwise, your increased sampling may converge to one of many different things, and replication may be a problem. For example, if you take X-rays of increasing frequency of some unknown object from a single point source, increasing the frequency allows you to construct an approximate shape of the unknown object, but what you construct could be one of infinitely many different shapes producing the same measurements. However, if you use two somewhat independent point sources AND you know the shape is convex, then it is known that increasing the number of X-ray measurements, at some uniform angular spread from each source, leads to approximate shapes which converge to the true shape of the object.
The point is, your question is a bit too general. More specifics are needed.
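For the bar-code analogy specifically, the standard reasoning step is: if each scan succeeds independently with probability p, the chance of at least one success in n scans is 1 - (1 - p)^n, which increases monotonically towards 1. A minimal R sketch (p is an assumed per-scan read probability):
p <- 0.7                      # assumed probability of a successful read on one scan
n <- 1:10                     # number of scanning attempts
p.detect <- 1 - (1 - p)^n     # P(at least one successful read in n independent scans)
round(p.detect, 4)            # 0.7, 0.91, 0.973, ... saturating towards 1
# plot(n, p.detect, type = "b")  # visualises the improvement with scanning frequency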
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
I wish to know whether Failure Modes and Effects Analysis (FMEA) is considered a probabilistic or a deterministic method of risk assessment.
Relevant answer
Answer
FMEA itself is a deterministic way of classifying failures and their consequences. However, if you put the probability of each occurrence into the blocks in order to find the final reliability, it becomes a probabilistic method.
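As a small illustration of that second case, here is a minimal R sketch (the failure probabilities and block names are hypothetical): once each FMEA block carries an occurrence probability, a system-level figure can be computed.
# Assumed per-demand failure probabilities for three FMEA blocks
p.fail <- c(pump = 0.01, valve = 0.005, sensor = 0.02)
R.series <- prod(1 - p.fail)   # system reliability if all blocks must work (series logic)
1 - R.series                   # probability that at least one block fails (about 0.035)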
  • asked a question related to Probabilistic Risk Analysis
Question
10 answers
Hello everyone,
this is the second draft of my Question, I'll keep refining it until it becomes readable, coherent and goes to the point. Thanks for the entries and the suggestions already offered. This is part of my Ph.D. studies, dealing with remote sensing techniques and numerical modelling of deforming slopes. The question popped out once I completed a run of simulations using a combination of 2D and 3D trajectory analysis software (Rocfall and Rockyfor3D, and I'm planning to add RocPro3D to the recipe as well).
In a Ritchie video (from Ritchie 1963, see the attached image for reference; I do actually love it), on the CD that comes with the book ROCKFALL: Characterization and Control (TRB), he explains how angular momentum, and increased rotational velocity, is one of the most important factors controlling the run-out of falling blocks: if a rock stays close to the slope and starts to roll faster and faster, it is very likely to end up farther from the bottom of the slope, even compared with the effect of other geometrical/physical properties. He also mentions how falling rocks tend to rotate perpendicular to their major axis, which is a minor issue for equidimensional blocks (spheres, cubes) but can be fundamental for elongated blocks (e.g. fragments of columnar basalt).
The real case scenario I'm testing the models with is a relatively small rockfall. Its vertical drop is about 15 m in a blocky/columnar weathered granite; the transition zone rests at approximately 45 degrees, covered in medium-sized blocks (10 cm to 1 m across); the deposition zone is about 25 m away from the vertical wall, confined by a 3 m high crushed-rock embankment. The energy line angle for this event is extremely high (around 80 degrees) because it is constrained by the rocktrap. I'll add some maps, maybe some screenshots, with sensitive information hidden.
In the simulations that I have run (in ecorisQ's Rockyfor3D), it looks like the column-like boulders (having a very evident major axis: the base is 0.4 m x 0.8 m, while the height is 1.8 m) travel farther than any other class of rocks (I have 3 classes: small spheres 50 cm in diameter, large cubes 1 m per side, and column-like blocks), even the ones larger in dimension and volume/mass but with all 3 axes of comparable length. You can observe the results in the maps attached to the question. Img02 has been computed with cubical blocks, Img03 with elongated blocks.
Upon investigation in GIS, the pixels farthest from the bottom of the slope, the ones that overtopped the rocktrap, show a value of roughly 0.05 (%). Following the considerations in the ecorisQ manual, they should be treated as outliers and are practically tolerable.
My question is: how do I interpret this effect? Is it due to the rigid-body approach? If everything else stays the same, mass should be the primary factor controlling the horizontal travel distance, right? Why do I find smaller blocks travelling farther? It might be a negligible difference given the extremely low likelihood of those blocks getting there, but does it tell me something I am missing about how the numerical model works?
Is there a way to visualise angular momentum/rotational velocity in that software? And, most importantly, is the way the problem has been formulated valid?
I really appreciate any help and any ideas you can share. I'm grateful for the time you'll spend on my problem. I'll probably keep adding details as needed. Thanks again.
Kind Regards,
Carlo Robiati, PhD student in Camborne School of Mines, UK
Relevant answer
Answer
As observed by Matthew:
"At 3:36 in the video it states: 'the only shape that has a marked effect upon the way a rock rolls is demonstrated by this elongated piece of columnar basalt. Its length gives it eccentric action.' By this I think they mean it inhibits its ability to roll. If you have ever tried to roll a fireplace-sized log down a natural slope, you will have observed the difficulty that is being referred to."
An elongated rock (or log) is unable to change direction as easily as the natural slope changes aspect, which causes it to wobble or bounce and lose momentum. I am not sure how/if you can model that effect.
Elongated rocks also have more of a tendency to break apart as they travel down a slope.
It is fortunate that the rockfall trap is effective for each of your simulations and not dependent on this issue.
good luck
Dave
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
Excuse my questions presented as statements. I actually mean that I have an idea but want others' thoughts.
I have a strong argument that verifiability does not carry an "axiomatic" value in science, but is there to reduce uncertainty (equivalently, to increase certainty). When we extrapolate too far, we cannot be that certain of our theory. How do we reduce the uncertainty? Observation.
Bedford and Cooke: "In practical scientific and engineering contexts, certainty is achieved through observation, and uncertainty is that which is removed by observation. Hence in these contexts uncertainty is concerned with the results of possible observations."
Agree? Comments?
Ref: Bedford, Tim; Cooke, Roger. Probabilistic Risk Analysis: Foundations and Methods (Page 19). Cambridge University Press. Kindle Edition.
Relevant answer
Answer
Your first statement is very insightful. Theoretical scientists like us always like to reduce the real world to a more manageable model, which necessitates simplifying assumptions. So first verify your assumptions; but in the real world, as Einstein says, a mathematical model will always be uncertain! See https://www.linkedin.com/pulse/risk-assessment-21st-century-fit-purpose-david-slater/
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
Power and gas retailers are exposed to a variety of risks when selling to domestic customers. Many of these risks arise from the fact that customers are offered a fixed price, while the retailer must purchase the gas and power to supply them from the wholesale markets. The risk associated with offering fixed-price contracts is exacerbated by correlations between demand and market prices. For example, during a cold spell gas demand increases and wholesale prices tend to rise, whilst during milder weather demand falls and wholesale prices reduce.
Relevant answer
Answer
If you are interested, you can read "Managing Price Risk for an Oil and Gas Company". Let me know.
  • asked a question related to Probabilistic Risk Analysis
Question
8 answers
Various methods exist for quantitative risk analysis, such as Monte Carlo simulations, decision trees, sensitivity analyses, etc. Is there any reliable classification for such methods?
Relevant answer
Answer
Please let me know if the following reference is useful to you:
Meyer, W. G. (2015). Quantifying risk: measuring the invisible. Conference paper presented at PMI® Global Congress 2015 - EMEA, London, England. Newtown Square, PA: Project Management Institute.
Check the following site to see if the article is freely downloadable:
Dennis Mazur
  • asked a question related to Probabilistic Risk Analysis
Question
2 answers
Two systems give uncorrelated or weakly correlated outputs while their inputs show correlated behaviour. How can one transform the data and find some (possibly complex) relationship between inputs and outputs that gives good correlation and copula dependence for the outputs?
Relevant answer
Answer
I am not sure I have understood the question, as it seems to contradict itself (which may just be my lack of insight), so I will rephrase it as I think you mean it.
You start with data D1 and D2 as inputs into systems S1 and S2 respectively, and you obtain outputs O1 and O2 respectively.
D1 and D2 are highly correlated. O1 and O2 are not well correlated. So S1 and S2 are diverging in their behaviour.
You expect S1 and S2 to behave similarly; therefore you were expecting O1 and O2 to be highly correlated.
You want to know how to mathematically transform S1 and S2 so that they produce better agreement.
Is this correct?
If not, apologies for misunderstanding.
If yes, are your D1 and D2 signal-based, or do they at least consist of ordered variables where neighbouring variables have a strong covariance?
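While waiting for that clarification, one generic remark: Pearson correlation is easily destroyed by nonlinear but monotone transformations, whereas rank-based (copula) measures are not. A minimal R sketch with made-up data:
set.seed(7)
x <- rnorm(500)
y <- x + rnorm(500, 0, 0.3)                # strongly dependent "inputs"
u <- exp(x); v <- exp(3*y)                 # hypothetical nonlinear "system outputs"
cor(u, v)                                  # Pearson: weakened by the nonlinearity
cor(u, v, method = "spearman")             # Spearman: invariant under monotone maps
cor(u, v, method = "kendall")              # Kendall's tau: the usual copula summary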
  • asked a question related to Probabilistic Risk Analysis
Question
9 answers
I want to model a proportional variable bounded by [0,1] (the % of land fertilized). A high percentage of the data contains 0s (60%), a smaller percentage contains 1s (10%), and the rest falls in between.
I want to compare different models with each other to see their performance; the model I am currently looking at is a zero-one inflated beta model. I am using the R package gamlss for this.
However, I am having some trouble with the quite technical documentation of the gamlss package and I don't seem to find an answer to my questions below:
1)      model
The model below fits 3 submodels: one part that models the probability of having y=0 versus y>0 (nu.formula), one part that models the probability of having y=1 versus y<1 (tau.formula), and a final part that models all the values in between.
gam<-gamlss(proportion~x1+x2,nu.formula=~ x1+x2,tau.formula=~ x1+x2, family= BEINF, data=Alldata)
This is okay I think.
2)      prediction
I would now like to know the predicted probability of an observation having y = 0 or y = 1. I predicted the probability of y = 0 with the code below; however, I get values far beyond the [0,1] interval, so they cannot be probabilities.
Alldata$fit_proportion_0<-predict(gam, what="nu", type='response')
summary(Alldata$fit_proportion_0)
Could somebody explain to me how to obtain the correct probabilities, because the code above does not seem to work? I think the answer to my problem can be found in section 10.8.2, page 215, of the following link (http://www.gamlss.org/wp-content/uploads/2013/01/book-2010-Athens1.pdf). I think it says that the predict function I use gives another quantity, which I then have to plug into a certain formula to find the real probabilities. But I am not sure how to make this work.
Relevant answer
Answer
Hi, I am not familiar with the gamlss package nor with the beta inflated distribution, but I think it should do the job. From your reference, p0 = nu/(1+nu+tau) and p1 = tau/(1+nu+tau).
In the piece of code below, the marked lines should solve your problem.
Best regards,
Geoffrey
## Code for Janka V. ##
rm(list = ls())
require(gamlss)
set.seed(28122016)
# Simulate two correlated covariates and a response clipped to the [0, 1] interval
rho = 0.2
N.obs = 2000
X1 = rnorm(N.obs, 0, 1)
X2 = rho*X1 + sqrt(1 - rho^2)*rnorm(N.obs, 0, 1)
Y = matrix(X1^2 - X2^2, ncol = 1)
Y = apply(Y, MARGIN = 1, FUN = min, 1)                    # cap at 1
Y = apply(matrix(Y, ncol = 1), MARGIN = 1, FUN = max, 0)  # floor at 0
tab = as.data.frame(cbind(X1, X2, Y))
# Zero-one inflated beta regression: mu/sigma for (0,1), nu for y=0, tau for y=1
gam.regression <- gamlss(Y~X1+X2, nu.formula=~ X1+X2, tau.formula=~ X1+X2, family= BEINF, data=tab)
# predict() returns nu and tau on their own scale, NOT as probabilities
prob.of.0.raw <- predict(gam.regression, what = "nu", type = "response")
prob.of.1.raw <- predict(gam.regression, what = "tau", type = "response")
## These should be the lines you are interested in
prob.of.0 = prob.of.0.raw/(1 + prob.of.0.raw + prob.of.1.raw)   # P(Y = 0)
prob.of.1 = prob.of.1.raw/(1 + prob.of.0.raw + prob.of.1.raw)   # P(Y = 1)
## One can observe that the mean fitted probabilities match the empirical shares:
delta.0 = mean(prob.of.0) - length(which(Y == 0))/N.obs
delta.1 = mean(prob.of.1) - length(which(Y == 1))/N.obs
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
What if the relative risk value is zero? Does that indicate there is no association between the risk factor and the disease?
Relevant answer
Answer
Do you mean RR=0 or RR=1? If RR=0 there are no cases of disease among those exposed to the factor, so the factor can be considered protective. If RR=1 the incidence among those exposed to the factor equals the incidence among the non-exposed; that is, the incidence of the disease in the study population does not change in the presence or absence of the factor, and therefore the factor is not associated with the disease.
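A minimal R sketch with hypothetical 2x2 counts makes the two cases concrete:
# Exposed group: a cases, b non-cases; unexposed group: c0 cases, d0 non-cases
a <- 0;  b <- 100              # no cases among the exposed
c0 <- 10; d0 <- 90
(a/(a + b)) / (c0/(c0 + d0))   # RR = 0: the factor looks protective
a <- 10; b <- 90               # same incidence in both groups
(a/(a + b)) / (c0/(c0 + d0))   # RR = 1: no association with the disease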
  • asked a question related to Probabilistic Risk Analysis
Question
2 answers
Hello all, 
I will perform containment analyses through GOTHIC - 3D. 
I couldn't find enough resources. 
Any kind of publication/tutorial/notes/tips are greatly appreciated.
Best regards,
E.B.
Relevant answer
Answer
Hi Erol, I guess that you are looking for analysis of design basis accidents; in that case, check out publications from the UPM safety group.
In case you are looking at severe accidents with hydrogen, there is an excellent researcher who works with GOTHIC: Michelle Andreani from PSI, who has contributed the most to hydrogen distribution in large enclosures.
Contact me for more info.
  • asked a question related to Probabilistic Risk Analysis
Question
1 answer
In the reliability analysis of repairable and redundant safety systems one needs to consider the effect of the maintenance program. We are developing a Markov model for the ECCS of a typical PWR, and for the calculation of transition rates we need typical values of the test interval and test duration for the ECCS of a PWR nuclear reactor.
Relevant answer
Answer
In the USA, Regulatory Guides (RGs) 1.79 and 1.79.1 relate to the preoperational and startup testing of emergency core cooling systems (ECCS) for Pressurized Water Reactors (PWRs) and Boiling Water Reactors (BWRs). These regulatory guides identify the ECCS functions that are to be tested as those necessary to ensure the specified design functions of the ECCS are met during conditions of normal operation, anticipated operating occurrences and postulated accident conditions.
In Japan, METI (Ministry of Economy, Trade and Industry) periodic inspections are conducted every 13 months in accordance with the EUIL (Electric Utility Industry Law) for light water reactors. These inspections include ISI (in-service inspection), system performance tests, containment leakage rate tests, and overhauls of mechanical equipment. In addition, many voluntary maintenance activities, most of which are periodic overhauls for mechanical equipment, are planned and conducted by electric utilities during the plant refueling outages.
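Once you have chosen values, the way they usually enter a Markov or fault-tree model is through the mean unavailability of a periodically tested standby train; here is a minimal R sketch using the standard first-order approximation q ≈ λT/2 + τ/T (all numbers are assumed, purely illustrative, not plant data):
lambda <- 1e-5        # assumed standby failure rate (per hour)
T.int  <- 2190        # assumed test interval: roughly 3 months, in hours
tau    <- 4           # assumed test duration (hours), train unavailable while tested
q.mean <- lambda*T.int/2 + tau/T.int   # failures between tests + downtime during tests
q.mean                                 # about 0.011 + 0.0018 = 0.013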
  • asked a question related to Probabilistic Risk Analysis
Question
7 answers
I have considered the hydraulic conductivity (Ks) of a clayey soil as a random variable with a lognormal distribution. I obtained a negative mean (lambda) after determining the measures of variation. Logically, I should not have a negative mean for the physical parameter Ks. Please find the attached Excel document. Kindly provide a solution as soon as possible.
Relevant answer
Answer
Yes, it is possible to have a negative value for the lognormal mean. The main purpose of using a lognormal distribution in probabilistic analysis is to ensure that only positive values are assigned to the variables (engineering properties like hydraulic conductivity). The parameter lambda is the mean of ln(Ks), which is normally distributed; since your Ks values are smaller than 1 (as is typical for clay, e.g. on the order of 1e-9 m/s), their logarithms, and hence lambda, are negative. When you transform back to physical space with the exponential function, the resulting Ks values are always positive (the exponential function is always positive). Hope this helps.
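A minimal R sketch with hypothetical clay conductivities shows why lambda is negative and why this is harmless:
Ks <- c(1e-9, 5e-9, 2e-8, 1e-7)       # assumed clay Ks values in m/s, all < 1
lambda <- mean(log(Ks))               # negative, since the log of a number < 1 is negative
zeta   <- sd(log(Ks))
set.seed(1)
Ks.sim <- rlnorm(1000, meanlog = lambda, sdlog = zeta)  # sample back in physical space
min(Ks.sim) > 0                       # TRUE: the simulated conductivities are all positive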
  • asked a question related to Probabilistic Risk Analysis
Question
4 answers
I cannot understand notions like €, this probability, or how to compute it.
Relevant answer
Answer
check the attached file.
  • asked a question related to Probabilistic Risk Analysis
Question
3 answers
Kindly, how does one use the maximum likelihood method (or any other method) to parametrize a mathematical model for risk analysis?
Relevant answer
Answer
If your (probabilistic) model governing the data generation is well specified, then maximum likelihood estimation literally means finding the parameters that maximize the likelihood function given the observed data.
Under standard (i.i.d.) assumptions, the likelihood function is the product of the model's probability density evaluated at each of the n data observations. This function has a specific form in the parameters.
I suggest the following to get you started:
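Separately, as a toy illustration of the mechanics (not the material referred to above; the lognormal loss model and all numbers are assumed), here is a minimal R sketch of maximum likelihood estimation via numerical optimization:
# Negative log-likelihood of a lognormal model; par[2] is log(sd) so the search is unconstrained
neg.log.lik <- function(par, x) {
  -sum(dlnorm(x, meanlog = par[1], sdlog = exp(par[2]), log = TRUE))
}
set.seed(42)
losses <- rlnorm(200, meanlog = 1, sdlog = 0.5)      # stand-in risk data
fit <- optim(c(0, 0), neg.log.lik, x = losses)       # minimize the negative log-likelihood
c(mu.hat = fit$par[1], sigma.hat = exp(fit$par[2]))  # ML estimates, close to (1, 0.5)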
  • asked a question related to Probabilistic Risk Analysis
Question
1 answer
How the "scale of fluctuation" interprets the homogeneity in clay sample prepared by slurry consolidation method ?
If the profiling of the clay bed is checked by CPT then How the scale of fluctuation estimated from CTP data will help in understanding the homogeneity along the depth?  
  • asked a question related to Probabilistic Risk Analysis
Question
7 answers
Hi dear researchers,
I need to do a probabilistic analysis. How can I do this analysis and plot the results?
Please send me useful sources.
Regards
Relevant answer
Answer
'Probabilistic analysis' is a rather wide description of what you want to do. Why do you need to do it? What is your goal? What do you want to analyze? If you can clarify that, it will be easier to come up with suitable references, because there are many of them and there is no cookbook you can just apply (actually, just applying one without considering the specific issues of your problem is a dangerous business).
  • asked a question related to Probabilistic Risk Analysis
Question
6 answers
I am planning to conduct an FMEA study for some existing equipment/devices. This device has several models, most of which have almost the same design features. I wish to find out the weak parts/subcomponents of this device type by conducting the study.
Since a number of different models of this device are available in the market, it is not possible to conduct the study on all the models.
Would it be appropriate to conduct the FMEA study on a specific model and extend the conclusions to the whole family of devices?
Relevant answer
Answer
As long as the different models are similar in function, it is more efficient to maintain a single FMEA to cover them. Within the single FMEA, it is possible to denote some failure modes as specific only to Model X, or you could add an extra column indicating the applicable equipment for each line item.
  • asked a question related to Probabilistic Risk Analysis
Question
1 answer
Actually, demand for internet sportsbooks is growing; more websites mean more gamblers. At the same time, more websites named "Picks" are winning buyers; the term "pick" refers to professional sports advice you could use to bet at the sportsbook.
Using applied statistics in European sports leagues that are corruption-free, could you predict the score with high probability, more accurately than the picks?
Relevant answer
Answer
Hi Alberto,
I wouldn't say that European sports leagues are corruption-free, as there is evidence of match fixing in lower leagues, let alone some major scandals in higher leagues (for example, Serie A a few years ago).
However, there are a number of papers that develop models which beat the betting market odds. Some very good papers appear in the following journals:
  • The Journal of Gambling Business and Economics
  • The Journal of Prediction markets
  • International Journal of Forecasting
  • Journal of Sports Economics
  • Journal of Applied Economics
  • Journal of Quantitative Analysis in Sports
  • International Journal of Sports Finance
Regards,
Dominic
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
What uncertainties can we assume in a probabilistic fatigue assessment of an existing steel bridge?
Relevant answer
Answer
Hi,
Based on the fatigue assessment approach, uncertainties can arise from various sources.
For example, if you choose the S-N approach for fatigue assessment, then uncertainties can arise from:
1. the S-N curve
2. Miner's rule (the linear damage rule)
To account for these uncertainties we use a probabilistic approach, as sketched below.
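A minimal Monte Carlo sketch in R (all distributions and numbers are assumed for illustration, not taken from any code or bridge) shows how these two uncertainty sources propagate to a failure probability:
set.seed(99)
n.sim <- 1e4
logA  <- rnorm(n.sim, mean = 12.0, sd = 0.2)       # uncertain S-N intercept (log10 scale), assumed
m     <- 3                                         # S-N slope, fixed here for simplicity
Delta <- rlnorm(n.sim, meanlog = 0, sdlog = 0.3)   # uncertain critical Miner sum (median 1)
S <- 80; n.cycles <- 2e6                           # assumed stress range (MPa) and applied cycles
N.fail <- 10^logA / S^m                            # cycles to failure from the S-N curve
D <- n.cycles / N.fail                             # Miner damage sum
mean(D > Delta)                                    # estimated probability of fatigue failure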
  • asked a question related to Probabilistic Risk Analysis
Question
7 answers
What strategy should be used for the quantification of human actions in Probabilistic Safety Assessment when no data is available? Is expert opinion using a questionnaire (with or without the Delphi method) a correct approach? Are there any other suggestions? I am looking for data for the headings of an event tree for a process.
Relevant answer
Answer
The Delphi technique is a widely used and accepted method for gathering expert data, for combining opinions on the expected development time of a particular technology, and for discovering what is actually known or not known about a specific topic.
In the field of radiation therapy services, an approach similar to the Delphi methodology is proposed in [1, 2]. In these papers the failure modes and human errors are initially conceived by members of the working group on an individual and independent basis (i.e. in "blind" mode), then collectively revised during a dedicated plenary session to reach general consensus.
However, in my opinion, the following problems are not solved:
- It is usually difficult for the experts to provide precise risk data, such as an occurrence probability;
- In a comparative study performed by different groups working on the same topic but in different company contexts, the investigators found little overlap among the failure modes and human errors that were identified.
1. Ciocca M, Cantone M C, Veronese I, Cattani F, Pedroli G, Molinelli S, Vitolo V, Orecchia R 2012 Application of failure mode and effects analysis to intraoperative radiation therapy using mobile electron linear accelerators Int J Radiat Oncol Biol Phys 82 (2) 305-11.
2. Cantone M C, Ciocca M, Dionisi F, Fossati P, Lorentini S, Krengli M, Molinelli S, Orecchia R, Schwarz M, Veronese I, Vitolo V 2013 Application of failure mode and effects analysis to treatment planning in scanned proton beam radiotherapy Radiation Oncology 8:127.
  • asked a question related to Probabilistic Risk Analysis
Question
13 answers
Could you please help me find references on combining probabilistic risk assessment studies with air pollution modelling?
For instance, a probabilistic safety assessment conducted for an ammonia pipeline, combined with the results of dispersion modelling, can be used to assess the annual cost of losses caused by an accident. I have come across such works regarding groundwater pollution (http://www.sciencedirect.com/science/article/pii/S0304389413008005), but not so much regarding air pollution; or maybe I am wrong?
Thank you in advance.
Relevant answer
Answer
Dear Ivan, I am sending you one of my papers about risk management. I think you will find something of interest in it.
  • asked a question related to Probabilistic Risk Analysis
Question
13 answers
I used risk reduction analysis to detect differences between the answers of two groups to a questionnaire question. 
Relevant answer
Answer
            Dieticians   Dentists     DI percent   DE percent   ARR       RRR
            (total=94)   (total=123)
Chocolate   81           120          0.86         0.98         -11.39    -13.22
Yogurt      23           56           0.24         0.46         -21.06    -86.07
Fruits      67           60           0.71         0.49          22.50     31.56
Sweeties    93           123          0.99         1.00          -1.06     -1.08
First find the RR. RR = 0.88 for chocolate indicates that dieticians are (1 - 0.88), i.e. 12%, less likely to consume chocolate compared to dentists.
The ARR is interpreted as the change in risk for individuals exposed to the risk factor. The ARR for chocolate is -11.39 percentage points; thus, for a dietician, the risk of chocolate consumption is reduced by 11.39 percentage points compared to dentists.
The RRR measures what proportion of the risk in the exposed group is attributable to the exposure.
The RRR for chocolate of -13.22% indicates that chocolate consumption among dieticians is 13.22% lower in relative terms, which may be attributed to their being dieticians.
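For reproducibility, the chocolate row can be checked with a few lines of R (the RRR here follows the convention used in the table above, i.e. the ARR divided by the risk in dieticians):
DI <- 81/94; DE <- 120/123        # risk of consumption among dieticians and dentists
RR  <- DI/DE                      # about 0.88
ARR <- (DI - DE)*100              # about -11.39 percentage points
RRR <- (DI - DE)/DI*100           # about -13.22 %
c(RR = RR, ARR = ARR, RRR = RRR)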
  • asked a question related to Probabilistic Risk Analysis
Question
5 answers
I have a basic doubt regarding the reporting of odds ratios. An odds ratio can be expressed either in terms of risk (>1) or protection (<1), depending on the reference group used: when the reference group is interchanged, the odds ratio changes from one (>1) to the other (<1). My question is: are there any specific criteria or rules regarding the reporting of odds ratios in terms of either risk or protection?
Relevant answer
Answer
Thank you @Derek. Another issue I met was with the power calculation. The power calculation requires the risk allele frequency; sometimes it is called the minor allele frequency. This is not an issue under normal conditions because in most cases the risk allele will be the polymorphic allele and not the wild-type allele. But in my case it is not the same, so I am confused whether I should use the risk allele (long allele) or the minor allele (wild-type allele).
  • asked a question related to Probabilistic Risk Analysis
Question
14 answers
The Monte Carlo method of data validation requires large data sets (random numbers) as a starting point.
Relevant answer
Answer
My colleague has provided a structured procedure for systematic sensitivity/uncertainty analysis in waste LCA. We now follow it in most of our LCA studies, and I think it is applicable to other types of LCA as well.