
Model Checking - Science topic

Questions related to Model Checking
  • asked a question related to Model Checking
Question
5 answers
I am thinking about using the MOLUSCE QGIS plugin to help me create a baseline map for 2030 based on land cover change across a given time series. I understand that spatial variables can be input into the modelling process. Most of the papers I have read use DEM or distance from roads when defining spatial variables. My question is: can I input a wider variety, e.g. crop suitability index, climate, population? Also, if I do use a greater number of variables, does the model check for collinearity? Or should I run a regression analysis separately?
Relevant answer
Answer
You should run the regression analysis separately and then use only the independent (non-collinear) variables for the land use change analysis.
Good luck!
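To complement the answer above: a quick pre-screen for collinearity can be done by computing pairwise correlations between the candidate spatial variables before handing them to MOLUSCE. A minimal sketch in Python (the predictor names and values are hypothetical, purely for illustration):

```python
# Pairwise Pearson correlation as a rough collinearity screen.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical spatial predictors (elevation, distance to roads,
# population density); substitute your own raster statistics here.
predictors = {
    "elevation":  [120, 135, 150, 160, 175, 190],
    "dist_roads": [5.0, 4.2, 3.9, 3.1, 2.5, 2.0],
    "population": [800, 950, 1100, 1300, 1500, 1700],
}

names = list(predictors)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = pearson(predictors[names[i]], predictors[names[j]])
        flag = "  <-- possibly collinear" if abs(r) > 0.8 else ""
        print(f"{names[i]} vs {names[j]}: r = {r:+.2f}{flag}")
```

Variables that correlate strongly (say |r| > 0.8) are candidates for dropping before the land-use change model is fitted.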
  • asked a question related to Model Checking
Question
5 answers
I have data for an SNP with genotypes TT, TC and CC. I want to check the effect of these individual genotypes on the disease outcome, which is iron deficiency. How can I apply dominant, recessive and additive binary logistic regression models, and check which fits best, in SPSS?
Relevant answer
Answer
Have you analyzed your data yet? If you are interested, I will help you; please let me know by message. Dr Thilakshi Abeywickrama
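For the modelling part of the question: the dominant, recessive and additive models differ only in how the genotype is recoded before being entered into the logistic regression. A minimal sketch (assuming C is the risk allele; swap the roles if your minor allele differs):

```python
# Recode an SNP genotype (TT, TC, CC) under the three inheritance models
# commonly compared before a binary logistic regression.
def code_genotype(genotype, model):
    c_count = genotype.count("C")     # number of risk alleles (0, 1 or 2)
    if model == "additive":
        return c_count                # 0 / 1 / 2
    if model == "dominant":           # at least one copy of C vs none
        return 1 if c_count >= 1 else 0
    if model == "recessive":          # two copies of C vs fewer
        return 1 if c_count == 2 else 0
    raise ValueError(model)

for g in ("TT", "TC", "CC"):
    codes = [code_genotype(g, m) for m in ("additive", "dominant", "recessive")]
    print(g, codes)
```

In SPSS you would create the three recoded variables, fit a binary logistic regression with each in turn, and compare the fits (e.g. by -2 log-likelihood or AIC).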
  • asked a question related to Model Checking
Question
2 answers
Hello everyone!
I’m trying to perform model checking over my 4 computed demographic scenarios (SNP data). I already ran the 4 million scenarios recommended for my data set, and this is the first time I have gotten this warning when trying to perform model checking:
Something happened during the analysis posterior_prob_4M_scn4 : Program of thread 'posterior_prob_4M_scn4' exited (with return code 1) unsuccessfully.
scenario[rt.scenteste-1].nparam = 10
apres detphistarOK nphistarOK=16
nphistarOK=16 nstat=50
Not enough suitable particles (16)to perform model checking. Stopping computations.
Has anyone dealt with this warning? I’m including the image of my pre-evaluation of priors. Thanks in advance.
Relevant answer
Answer
Did you get a counterexample?
  • asked a question related to Model Checking
Question
8 answers
I am currently working on a systematic literature review on model checking. Please help in this case.
Relevant answer
Answer
Start by clearly defining Model Checking and its related concepts, because model checking involves a suite of concepts: the types of models (e.g., automata or Petri nets), the verification logic languages of a model checker, and so on. First, list some model checkers and identify their different parts: model, engine, verification language, and set of verification properties. Study each model checker and each of its parts in detail. See where each has an edge over the others and what types of application each can model, and compare their performance and applicability. Case studies are helpful for understanding the tools practically.
  • asked a question related to Model Checking
Question
4 answers
Relevant answer
Answer
We came up with a new method to model COVID-19; our preprint is available on medRxiv. It is a very simple yet effective method. The Matlab code is also available at the end.
  • asked a question related to Model Checking
Question
11 answers
I have behavioral data (feeding latency) which is the dependent variable. There are 4 populations from which the behavioral data is collected. So population becomes a random effect. I have various environmental parameters like dissolved oxygen, water velocity, temperature, fish diversity index, habitat complexity etc. as the independent variables (continuous). I want to see which of these variables or combination of variables will have significant effect on the behavior.
Relevant answer
Answer
I agree with A. U. Usman's answer, but some other techniques, such as non-linear analysis, cluster analysis, factor analysis, etc., may also be utilized in this regard.
  • asked a question related to Model Checking
Question
2 answers
Extended Finite State Machine (EFSM) is a well-known method for modeling systems and then generating test cases for the purpose of conformance testing. Automatic testing tools help a lot in generating test cases. Is there any tool to generate test cases from an EFSM model?
Relevant answer
  • asked a question related to Model Checking
Question
3 answers
Dear Researchers
What is a DID (difference-in-differences) model? I want to apply the DID model to check the impact of a policy change on the exports of different countries. Can anyone please guide me on how to apply the DID model?
Thanks and best regards
S. H. Irshad
Relevant answer
Answer
Difference in differences (DID or DD) is a statistical technique used in econometrics and quantitative research in the social sciences that attempts to mimic an experimental research design using observational study data, by studying the differential effect of a treatment on a 'treatment group' versus a 'control group' in a natural experiment. It calculates the effect of a treatment (i.e., an explanatory variable or an independent variable) on an outcome (i.e., a response variable or dependent variable) by comparing the average change over time in the outcome variable for the treatment group, compared to the average change over time for the control group. Although it is intended to mitigate the effects of extraneous factors and selection bias, depending on how the treatment group is chosen, this method may still be subject to certain biases (e.g., mean regression, reverse causality and omitted variable bias).
In contrast to a time-series estimate of the treatment effect on subjects (which analyzes differences over time) or a cross-section estimate of the treatment effect (which measures the difference between treatment and control groups), difference in differences uses panel data to measure the differences, between the treatment and control group, of the changes in the outcome variable that occur over time.
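The core computation described above can be sketched in a few lines (the export figures below are hypothetical, purely for illustration):

```python
# Difference-in-differences point estimate from group means:
# (change in treated group) minus (change in control group).
def did(treat_pre, treat_post, ctrl_pre, ctrl_post):
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

treated_before = [100, 110, 105]   # exports of policy-affected countries, pre-change
treated_after  = [130, 140, 135]
control_before = [90, 95, 100]     # comparable unaffected countries
control_after  = [100, 105, 110]

print(did(treated_before, treated_after, control_before, control_after))  # → 20.0
```

In practice the same estimate is usually obtained as the coefficient of the interaction term in the regression y = b0 + b1·treated + b2·post + b3·(treated×post), which also yields a standard error for inference.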
  • asked a question related to Model Checking
Question
3 answers
Hi everyone,
I am modeling a prestressed concrete beam in ANSYS APDL v19. As far as I know, it is not possible to apply the prestrain with the GUI as in the old version, ANSYS v12. So I used the "inistate" command like this to apply prestrain to LINK180:
inistate,set,dtyp,epel
inistate,set,mat,3 ! (mat 3 = properties of the prestress strand)
inistate,defi,,,,,,,,,,,,0.001903
Is this command correct?
I issued these commands once I had completed the concrete and steel modelling and applied the boundary conditions, since prestress itself is the first load step in ANSYS, without any external load.
I treated it as load step 1, but when I run the model and check the camber due to prestress, it shows no value in the contour diagram and does not deflect upward.
I first applied the commands directly without selecting the prestressing steel, and then tried again after selecting it; in both cases the prestrain value was not considered at all.
Do I need to do a few more steps? If yes, please mention them.
Can anyone please kindly suggest a clear stepwise procedure for applying prestrain to LINK180?
Thank you,
Regards,
Pandimani.
Relevant answer
Answer
Try these commands:
ESEL,s,sec,,1 ! select the LINK180 elements (example: section ID = 1)
inistate,SET,CSYS,-2
inistate,SET,DTYPE,EPEL ! initial-state data type: elastic pre-strain
inistate,define,,,,,Strain_value ! set the pre-strain value
  • asked a question related to Model Checking
Question
3 answers
I have been using the "exclude models" check box but there are still many structural sequences in the results. Is there a way to go about the BLAST search which will exclude these sequences? I've been using the nr database because some sequences I need don't show up in Ref Seq, so I'm limited to the nr database.
Relevant answer
Answer
If you look at the list of accession names you get, PDB IDs have a length of six characters with an underscore in position 5 (XXXX_X: PDB ID, underscore, chain ID); all other sequence IDs seem to be longer.
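Building on that heuristic, a short script can filter out PDB-style accessions automatically (a sketch; the pattern and the sample accessions are assumptions based on the answer above, not an official NCBI rule):

```python
import re

# Heuristic: PDB-derived accessions look like "XXXX_X" (a 4-character
# PDB ID starting with a digit, underscore, one-character chain ID);
# other sequence IDs are longer.
pdb_like = re.compile(r"^[0-9][A-Za-z0-9]{3}_[A-Za-z0-9]$")

accessions = ["1ABC_A", "NP_001234.1", "7XYZ_B", "WP_003456789.1"]
non_structural = [a for a in accessions if not pdb_like.match(a)]
print(non_structural)   # the PDB chain entries are filtered out
```

Applied to a downloaded BLAST hit table, this keeps only the non-structural sequence hits.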
  • asked a question related to Model Checking
Question
5 answers
I ran a negative binomial regression on Disease A, which is count data; due to over-dispersion, Poisson regression was unacceptable. I created 4 models to check the effect of different drugs on Disease A, and I put time in as a continuous variable. Assuming that all assumptions were fulfilled and that the output is from SPSS, how do I interpret the four models and the change in IRR with the Time and Drug variables?
Note: the data are drugs prescribed in a population and cases of Disease A in that population.
Relevant answer
Answer
Negative binomial and Poisson regression are two distinct but similar models; negative binomial regression is used for modeling over-dispersed count variables. In SPSS syntax, the keyword WITH indicates that the variable that follows is a continuous predictor. To interpret the coefficients, work through the output tables one factor at a time (Region, Gender, Residence, Age group): each IRR gives the multiplicative change in the expected count relative to the reference level, holding the other predictors constant.
  • asked a question related to Model Checking
Question
3 answers
Hello
I need some articles in the field of "runtime verification in smart grid"
or the field of "model checking in smart grid".
Can you point me to some?
Thanks
Relevant answer
Answer
Yehia Abd Alrahman, I'm sorry for the late reply; both fields.
Nazakat Ali, thank you.
  • asked a question related to Model Checking
Question
3 answers
I would like to test whether a given variable is associated with both the dependent and the independent variables (and is therefore a potential confounder) in a repeated-measures design.
My model has a continuous dependent variable and two independent variables: a within-subject factor with 6 levels and a between-subject factor with 2 levels. I also have a continuous independent variable that is statistically associated with my between-subject factor (tested using a 2-sample t-test).
How should I test the association between this continuous variable and the dependent variable?
(1) by adding this variable to my original model and checking the effect of this extra variable?
or
(2) by fitting a new model with only my within-subject factor and this extra variable (i.e. without the between-subject factor)?
Relevant answer
Hi,
Confounding adjustment by regression allows you to directly check for the effect of the potential confounder on the outcome, and indirectly check for the association between it and the exposure/intervention.
The potential confounder's coefficient can be statistically significant or not. If not, it is not relevant; if it is, there are two options:
1. The coefficient of the main exposure is affected (as Esterman says, by at least 10%, relative), and there is a confounding effect.
2. The coefficient of the main exposure is not affected (<10%, relative), and there is no confounding effect.
Necessarily, in the first case the main exposure and the potential confounder are associated.
If you have more than one parameter for your main variable (an interaction with time or another variable), you can apply this logic to the marginal estimate of the main exposure effect.
  • asked a question related to Model Checking
Question
1 answer
I require three researchers to work with me on this book chapter project. The proposal is already accepted.
The writing time is one month.
If you are interested in participating in this publication, please reach out to me. I will take people with experience in research publication and knowledge of these disruptive innovations.
Book Chapter name:  
DISRUPTIVE TECHNOLOGIES: BLOCKCHAIN, BIG DATA, AND IoT.
Objectives
The specific objectives of this chapter are to:
i. Explore the potential application, opportunities, and challenges posed by blockchain technology on big data, and the Internet of Things.
ii. Propose a model for checking the perceived readiness and perceived ease of use of blockchain, big data and IoT technologies.
iii. Lay an understanding of how disruptive blockchain technologies will be to small and big firms (how to identify that a company or industry is going to be disrupted).
Desired skills and experience
Knowledge about;
  • BLOCKCHAIN
  • BIG DATA
  • IoT
About Me
I research about
  • ONLINE LEARNING TOOLS
  • BLOCKCHAIN
  • BIG DATA
  • IoT
  • ADOPTION OF ICT
Areas of Research
  • Data Analytics
  • Internet of Things
  • Networks & Internet Technologies
Relevant answer
Answer
okay will get in touch
  • asked a question related to Model Checking
Question
3 answers
  1. Artificial intelligence safety is important for us to build an AI world, but I am somewhat confused about how to verify AI.
  2. What are the differences between model checking and theorem proving in traditional formal methods, and the methods for Verified AI?
  3. For traditional cyber-physical systems (CPS), what new problems should we consider when a CPS carries a neural network?
Relevant answer
Answer
Hi,
Artificial intelligence safety is important; model checking and theorem proving from traditional formal methods are relevant starting points for verifying AI.
Regards
  • asked a question related to Model Checking
Question
1 answer
hello!
I plan to establish an LCMV meningitis model and to check the protective effect of adoptive transfer of some memory CD8 T cells.
In Current Protocols in Immunology (2001), M. von Herrath and J. L. Whitton state the use of 1 ml 27G syringes for that, but do I need stereotaxis?
  • asked a question related to Model Checking
Question
4 answers
Hi,
I have built a Gaussian mixture model to check the likelihood of each point and then identify the outliers by plotting the log-likelihood, as in the attached graph. In this particular case, the points below -2 can be considered outliers by visualizing how far they lie from the rest of the points. The challenge is that I have hundreds of such cases, so plotting and checking the outliers manually is impractical. Is there a statistical way to determine the cut-off point (like -2 in this case) so I can write a program to avoid the manual work?
Relevant answer
Answer
It seems you want to label points with relatively low likelihood under your model as "outliers". I don't think there is any "objective" rule to justify any particular cut-off. As a heuristic, you may specify what proportion of the values you can accept being labeled as "outliers" and choose the corresponding quantile. Or you can use Chebyshev's inequality to control the upper limit of the probability of labeling a value, given the model.
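The quantile heuristic can be automated in a few lines; a sketch with simulated log-likelihood scores (in your program they would come from the fitted mixture model, e.g. per-point log-likelihoods):

```python
import random
import statistics

# Simulated per-point log-likelihood scores: a bulk of typical points
# plus three injected low-likelihood points (all values hypothetical).
random.seed(0)
scores = [random.gauss(0.0, 0.5) for _ in range(1000)] + [-3.0, -2.5, -2.8]

# Flag the lowest 1% of log-likelihoods as outliers instead of choosing
# a cut-off by eye for every plot.
cutoff = statistics.quantiles(scores, n=100)[0]   # 1st percentile
outliers = [s for s in scores if s < cutoff]
print(f"cutoff = {cutoff:.2f}, {len(outliers)} points flagged")
```

The acceptable proportion (here 1%) is the one subjective choice; once fixed, the cut-off is computed the same way for every case.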
  • asked a question related to Model Checking
Question
4 answers
Hi everyone, 
I am working on case-control genetic data. I have data for an SNP with genotypes AA, AG and GG. I want to check the effect of these individual genotypes on the disease outcome, which in my case is diabetes. Now I want to calculate the unadjusted and adjusted (for age and gender) odds ratios in SPSS. I calculated the unadjusted odds ratio using multinomial regression (is this suitable for my data?), but I do not know how to calculate the odds ratio after adjustment for age and gender.
Secondly, how can I apply the dominant, recessive and co-dominant models to check which fits best?
I would be highly obliged if you provide me the flowchart of commands like 
Spss--> analyze--> regression--> .........
Thanks, 
Misbah
Relevant answer
Answer
Model selection requires some other techniques. Could you please explain which techniques you have in mind?
  • asked a question related to Model Checking
Question
4 answers
I am establishing a T-cell co-cultured 3D human cancer spheroid model. One hour after addition of HLA-matched T cells to the spheroids, I see that the T cells infiltrate the spheroids. I used this model to check whether immunotherapy using an anti-PD1 inhibitor is a viable option to treat this cancer. On performing a cytotoxicity assay, I found that T cells alone were killing the cancer cells and that the addition of the anti-PD1 checkpoint inhibitor actually suppressed the cytotoxicity. I wonder if the T cells were compatible with the cancer cells even though they were HLA matched; what additional test should be done to confirm that? What could be the other possible reasons for the observed cytotoxicity?
Relevant answer
Answer
Thanks for your answer and suggestion. Immunology is relatively new for me. I am not working with cell lines; my 3D cultures are from patient-derived xenografts (PDX). The T cells are not from the same donor as the PDX, but they are HLA matched. There are only CD8+ and CD4+ T cells, and no DCs or NK cells, in the mix.
  • asked a question related to Model Checking
Question
36 answers
I have 5 time points: 1 hr, 2 hr, 3 hr, 5 hr, 8 hr, and I recorded the percentage (by weight) of a specific yeast in a population at each time point. I repeated this experiment 1000 times. Therefore, I have five points for my x values and 1000 y values for each x. I want to apply the Gompertz model to my data.
My questions are: 1) Are five x points enough for building this model?
                  2) Is there any R code for this model and for checking its adequacy?
Thank you in advance. 
Relevant answer
Answer
Hi Jack,
Here is an example of the R code I used to fit a Gompertz model to my data (number of cases of Ebola in Liberia):
#***************** WORK ON GOMPERTZ MODEL *******************#
library(minpack.lm)
# Starting values for the nonlinear least-squares fit
alpha <- 9526
beta  <- 9.1618
k     <- 0.0028
nls.fit.gompertz <- nlsLM(cases ~ alpha * exp(-beta * exp(-k * days)),
                          data = data2,
                          start = list(alpha = alpha, beta = beta, k = k),
                          control = list(maxiter = 500))
coef(nls.fit.gompertz)
# AIC with 4 estimated parameters (alpha, beta, k, residual sigma)
aic.gompertz <- -2 * logLik(nls.fit.gompertz) + 2 * 4
aic.gompertz # 1149.448
# Fitted values: alpha = 9437, beta = 59.24, k = 0.0219
# Now evaluate the fitted Gompertz curve - use data2
library(growthmodels)
use.it.gompertz2 <- gompertz(data2$days, alpha = 9437, beta = 59.24, k = 0.0219)
use.it.gompertz2
length(use.it.gompertz2)
# Predict roughly one, two and three months past the last observed day (357)
days.predict2 <- c(data2$days, 357 + c(31, 61, 92))
predict.it.gompertz2 <- gompertz(days.predict2, alpha = 9437, beta = 59.24, k = 0.0219)
predict.it.gompertz2
length(predict.it.gompertz2)
predict.it.gompertz2[83:86]
result.gompertz2 <- data.frame(days.predict2, predict.it.gompertz2)
result.gompertz2.compare <- cbind(result.gompertz2[1:83, ], data2$cases)
names(result.gompertz2.compare) <- c("days", "predict", "cases")
# The model generally underestimates the number of cases
# Plot observed (blue points) and predicted (red line)
ylim <- range(data2$cases)
plot(data2$days, data2$cases, col = "blue", xlab = " ", ylab = " ",
     main = "Predicted and Observed cases over time",
     xlim = range(days.predict2), ylim = ylim)
par(new = TRUE)
plot(days.predict2, predict.it.gompertz2, col = "red", type = "l",
     xlab = " ", ylab = " ",
     main = "Predicted and Observed cases over time",
     xlim = range(days.predict2), ylim = ylim, sub = "Gompertz Model")
Hope it helps.
  • asked a question related to Model Checking
Question
4 answers
We are using a GARCH model to check the volatility of time-series data. How can we check the economic significance of the model, especially the extent to which the independent variables contribute to the volatility (ht) in each period?
Relevant answer
Answer
Dear Abdul Rishad,
ARCH and GARCH models have become important tools in the analysis of time series data, particularly in financial applications. These models are especially useful when the goal of the study is to analyze and forecast volatility. The attached article may help you understand more about the economic significance of the GARCH model.
Best Wishes,
  • asked a question related to Model Checking
Question
5 answers
In the modeling space, various levels of abstraction have been introduced to facilitate readability and improve correctness (by minimizing details). For example, a flat transition system may be encoded hierarchically. Although these systems differ structurally, it is expected that they behave similarly.
To certify the equivalence of these models, a formal (mathematically sound) method is required to prove it. While theorem proving may be applicable, though difficult to apply, a fully automated approach (i.e., model checking) is needed to realize our goal.
Are you aware of any model checking approach that can answer the equivalence question at the press of a button?
Relevant answer
Answer
Usually you have to define an equivalence relation that specifies what it means for two models to be equivalent. To do so, you have to map both models into the same semantic domain, say an LTS or whatever suits you. Then you have to formalize, as I mentioned before, what it means for two models to be behaviorally equivalent. This is application-dependent. In concurrency theory, you usually define a notion of bisimulation, which generally states that two models are equivalent if they can mimic each other in terms of observations. The point here is that you look at your models as black boxes: if one of them can make a move, the other one should be able to mimic this move, and vice versa.
The point that I want to stress is that you have to somehow translate your models into a common framework, whether another language with a well-understood semantics or a mathematical domain. After that you define your notion of equivalence, and there are plenty of algorithms to compute equivalence relations. For instance, if your semantic domain is an LTS and you have a notion of bisimulation, you can use the standard Paige-Tarjan algorithm for partition refinement.
As for a tool that can compute bisimulation with a click, take a look at TAPAs; it is only for process algebra.
Note: if your models are sequential in nature, then equivalence checking is much easier, as you can rely on denotational semantics. For operational models where concurrency is involved, the situation is more complicated: in a sequential model, a program can generally be defined as a function from input to output and you don't care what happens in between (assuming there are no side effects), but in concurrency, program executions interleave and you have to care about what happens in between.
Hope that is of help to you,
Yehia
P.S. This is called equivalence checking, not model checking.
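The partition-refinement idea behind bisimulation checking can be sketched naively as follows (the states, labels and transitions below are hypothetical; real tools such as Paige-Tarjan implementations use far more efficient data structures):

```python
# Naive partition refinement for strong bisimulation on a small LTS:
# repeatedly split blocks of states until all states in a block have
# the same labelled moves into the current blocks.
def bisimulation_classes(states, transitions):
    """transitions: set of (source, label, target) triples."""
    partition = [set(states)]          # start with one block of all states
    while True:
        def signature(s):
            # Which blocks can s reach, and under which labels?
            return frozenset(
                (lbl, idx)
                for (src, lbl, tgt) in transitions if src == s
                for idx, block in enumerate(partition) if tgt in block
            )
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_partition.extend(groups.values())
        if len(new_partition) == len(partition):   # no block was split
            return new_partition
        partition = new_partition

# Two processes that should be bisimilar: p does a.b, q does a.b as a copy.
states = ["p0", "p1", "p2", "q0", "q1", "q2"]
transitions = {("p0", "a", "p1"), ("p1", "b", "p2"),
               ("q0", "a", "q1"), ("q1", "b", "q2")}
classes = bisimulation_classes(states, transitions)
for c in sorted(classes, key=min):
    print(sorted(c))
```

Here p0 and q0 end up in the same equivalence class, i.e. the two processes are strongly bisimilar.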
  • asked a question related to Model Checking
Question
3 answers
Will model checking be a good choice for finding Trojans?
Relevant answer
Answer
Hi, I would say model checking for hardware is as powerful as for software systems. But in the end, it depends on how efficient and fine-grained your models are.
  • asked a question related to Model Checking
Question
8 answers
Can this data be generated through any programming language or software?
Relevant answer
Answer
Williams gives a good overview on the topic in
Williams, H. P. (2013). Model building in mathematical programming. John Wiley & Sons.
The PDF is available on ResearchGate.
  • asked a question related to Model Checking
Question
14 answers
One of the reviewers of our paper is of the view that: "Counting number of non-deterministic transitions doesn't give you any quantitative information about time or throughput."  whereas PRISM has a whole MDP benchmark that deals with (network) performance mostly: http://www.prismmodelchecker.org/benchmarks/props-mdp.php
Do you agree/disagree with this comment? In particular, a good research paper/book for/against this comment would be highly appreciated.
Relevant answer
Answer
Great. Thanks a lot for your detailed answers and resolving the confusion. 
  • asked a question related to Model Checking
Question
14 answers
Hello there,
I am working on a flow simulation in a rotary type of HCCI engine. I was using the species model to check the mixing of air and fuel without reactions. The next step is combustion modeling. I am looking to work with the detailed chemistry of the fuel n-heptane. I found a reduced mechanism for n-heptane in CHEMKIN format, but I don't know how to start with the combustion problem using a CHEMKIN mechanism in ANSYS FLUENT.
Please let me know how to start, or provide me some stepwise tutorials for combustion using CHEMKIN in FLUENT.
Thanks a lot in advance,
Parth 
Relevant answer
Answer
Yes, I do have the CHEMKIN mechanism, but I am looking for a tutorial that shows the preprocessing for the combustion and the compatible solution techniques when importing a CHEMKIN mechanism.
Thanks though for your response.
Parth
  • asked a question related to Model Checking
Question
9 answers
I want to develop a model of vehicle speed using multiple linear regression. I have 457 samples: I used 300 samples for model development and kept 157 samples to check model accuracy. Based on the proposed model, I compute predicted speeds. Which test can I then perform to compare predicted and observed speeds?
Relevant answer
Answer
You could use an indicator like RMSE to evaluate accuracy.
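For concreteness, RMSE on the held-out samples is a one-liner (the speed values below are illustrative, not from the question's data):

```python
# Root-mean-square error between observed and predicted speeds on the
# hold-out set (here 5 illustrative points standing in for the 157).
def rmse(observed, predicted):
    n = len(observed)
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

observed  = [52.0, 61.5, 48.0, 55.2, 60.1]
predicted = [50.5, 63.0, 47.2, 56.0, 58.9]
print(f"RMSE = {rmse(observed, predicted):.2f} km/h")  # → RMSE = 1.20 km/h
```

MAE or the coefficient of determination R² on the same hold-out set are common companions to RMSE.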
  • asked a question related to Model Checking
Question
4 answers
While experimenting with the nuXmv and NuSMV model checkers, I observed that input variables contribute no states to the state space of a given model.
The user manuals of these tools clearly describe the syntactic differences between these kinds of variables: for example, "IVAR" introduces an input variables paragraph, while "VAR" introduces state variables.
Is there any science behind this behavior of input variables? In particular, how do these model checkers handle input variables versus state variables? Any reference on this observation will be appreciated.
Relevant answer
Answer
Hi Yasir,
Here are my concerns:
MODULE main
-- this is an IVAR paragraph
  IVAR
   v1 : 0..20;
   v2 : 0..20;
-- this is a VAR paragraph
  VAR
    v3 : 0..100;
ASSIGN
  init(v3) := 0;
  next(v3) := case
    v2 + v1 = 0 : 10;
    TRUE : v2 + v1;
  esac;
LTLSPEC !F(v3 = 0)
But by simply changing this to VAR, we have x = 18081, y = 44541
MODULE main
-- this is a VAR paragraph
  VAR
   v1 : 0..20;
   v2 : 0..20;
-- this is a VAR paragraph
  VAR
    v3 : 0..100;
ASSIGN
  init(v3) := 0;
  next(v3) := case
    v2 + v1 = 0 : 10;
    TRUE : v2 + v1;
  esac;
LTLSPEC !F(v3 = 0)
The difference shows up in the values of y. So my question is: what makes IVAR behave differently from VAR?
Try to run my code and compare the complexities; you should be able to see the difference. I would have modified your code, but the restrictions on IVAR would not let it compile.
  • asked a question related to Model Checking
Question
1 answer
I am conducting some experiments focused on the number of BDD nodes required for the analysis of a given property on various NuSMV or nuXmv models.
It would be appreciated if anyone could provide a guide on the procedure or recommend resources to assist me in conducting the experiments.
Thanks.
  • asked a question related to Model Checking
Question
10 answers
I want to fit my data with the Gompertz curve, so I should find the free parameters of the Gompertz function that lead to low bias and variance. What's your suggestion?
Relevant answer
Answer
Negin -
I think that whenever you use a "training" set of data to estimate coefficients for your model, and then a different set of data to check performance, you are more or less looking at all the kinds of error that could come up, though measurement error does make this less than a straightforward comparison of predicted vs "true" y values. Even taking out one observation at a time, to see how a model based on the others would have estimated (predicted) it, can be helpful. There are a number of books and papers on model validation, and model selection for that matter, that you could look into; among the ones that might help are the Hastie et al. and Fox books I mentioned.
I have a paper on ResearchGate where I did a general validation of a model across a number of categories for official statistics, the third in a series of papers ending in 2001, but I doubt it is what you need. I suggest you look at textbook examples of validation.
Best wishes - Jim
PS - This means that you could use a number of methods, but the key is "how" you use them. The "training" set of data has to be used to form your model, which is then applied to other data. Often this is referred to as "splitting" the data. Compare what you have predicted, pretending you did not have those y data, to what you actually had. You can then choose methods from those noted above, or perhaps others.
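The splitting procedure Jim describes can be sketched as follows, with a toy linear model standing in for the Gompertz fit (the data are simulated and the split sizes are illustrative; substitute your own fitting routine):

```python
import random

# Simulated data: y = 2x + 1 plus noise. Shuffle, then "split" into a
# training set (used to fit) and a hold-out set (used only to validate).
random.seed(1)
data = [(float(x), 2.0 * x + 1.0 + random.gauss(0, 0.1)) for x in range(50)]
random.shuffle(data)
train, test = data[:35], data[35:]

def fit_line(points):
    # Ordinary least squares for y = a + b*x.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

a, b = fit_line(train)                 # coefficients from the training set only
# Predict for the held-out points, pretending we never saw their y values.
err = (sum((y - (a + b * x)) ** 2 for x, y in test) / len(test)) ** 0.5
print(f"hold-out RMSE = {err:.3f}")
```

The hold-out error, not the training error, is the honest measure of how the fitted model generalizes.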
  • asked a question related to Model Checking
Question
8 answers
I want to fit my data (50 samples, each containing 100 data points) with a Gompertz curve. After that, I want to check whether the fit is appropriate; here, "appropriate" means low bias and variance. I'm looking for an implementation for this purpose. I found the distfit function in Matlab for fitting, but it doesn't support the Gompertz distribution. What is your suggestion?
Relevant answer
Answer
Dear Negin,
I'm not sure if I understand you correctly, but I would say that you just want to fit a Gompertz distribution to your data. I don't know how this works in Matlab, but in R you could use e.g. the flexsurv package, see the first link. Also the vglm() function from the VGAM package seems to support the Gompertz distribution. Or you use the makeham() function, which is also contained in the VGAM package, see the second link.
Best, Stephan
  • asked a question related to Model Checking
Question
4 answers
The all-share price index, turnover ratio and bank credit have been used as proxies for financial development, and the real GDP growth rate for economic growth.
Is it appropriate to use a VECM model to check the relationship between the stock market and economic growth using the above-mentioned proxy variables?
Relevant answer
Answer
Including additional variables such as saving and investment in your model does not take away from what you are investigating. Apart from linking the stock market and economic growth, the two variables account for misspecification biases in the model; you had better include them to have an appropriate model.
  • asked a question related to Model Checking
Question
5 answers
By using the PAT model checker, we can create a model and simulate it; so, to verify a CTL specification for the model, we should translate the model to SMV code and verify it in the NuSMV model checker.
Relevant answer
Answer
Hello,
We developed a Java transformation tool that accepts any model as input and translates it into NuSMV code. Please check the following link.
  • asked a question related to Model Checking
Question
9 answers
It will be of great help if anyone can provide me with the data sets so that I can use them in my model to check the significance of the work.
I hope researchers can offer their support without any hesitation.
with Regards
Syed Mohsin Saif
Relevant answer
Answer
Dear Jan Friso Groote
Thanks for your suggestions
with Regards
Syed Mohsin Saif
  • asked a question related to Model Checking
Question
3 answers
Given a set of minimal T-invariants of a Petri net, is it possible to reduce the state space of the net system?
Relevant answer
Answer
You might be interested in this paper (we used T-invariants as a memory optimization heuristics during state exploration):
R. Carvajal-Schiaffino, G. Delzanno and G. Chiola.
Combining Structural and Enumerative Techniques for the Validation of Bounded Petri Nets.
In T. Margaria and W. Yi, editors. Tools and Algorithms for the Construction and Analysis of Systems. TACAS 2001.
Lecture Notes in Computer Science 2031, Springer. 2001.
  • asked a question related to Model Checking
Question
6 answers
I have a model in NuSMV and I want to verify that it is giving me the correct result for my input. Generally, we hard-code the values in NuSMV.
Relevant answer
Answer
Try running NuSMV interactively, using, for example, the simulation commands with the -i option; then you can choose one of the possible next states (values of the variables).
  • asked a question related to Model Checking
Question
4 answers
I am reading about logical languages for writing safety/liveness specifications for model checking. I have read these things in mathematical form and understood them, but I want to understand the differences with real-life examples.
Relevant answer
Answer
Thanks, Zhang! It was an interesting and useful discussion.
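To complement the discussion, the distinction is often illustrated with small temporal-logic specifications; here is a sketch over a hypothetical mutual-exclusion model (the variable names crit1, crit2, req and grant are assumptions, not from any specific model in this thread):

```smv
-- Safety: "something bad never happens" -- the two processes are
-- never in the critical section at the same time.
LTLSPEC G !(crit1 & crit2)

-- Liveness: "something good eventually happens" -- every request
-- is eventually granted.
LTLSPEC G (req -> F grant)
```

A real-life analogy: safety is "a train never enters an occupied track block"; liveness is "every waiting passenger eventually boards". A safety violation has a finite counterexample trace, while a liveness violation needs an infinite (lasso-shaped) one.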
  • asked a question related to Model Checking
Question
15 answers
UML is a very popular modelling technique and tool. Formal specifications are based on mathematics, and formal verification is done by model checking, proving axioms, and algebra-based methods. How can we integrate formal verification into UML specifications? Is it feasible to make this integration?
Relevant answer
Answer
Check out our tool for direct UML verification without any translation to other formal languages.
We are releasing the new version soon.
  • asked a question related to Model Checking
Question
28 answers
I am looking for tool chains (even a model based engineering methodology) to enable formal verification of ERTMS (railway signalling) systems. Something along the lines of how Prover works with Simulink and SCADE, but preferably a Symbolic tool like NuSMV. or other industrially viable tools with some way to have a formal verification.
Relevant answer
Answer
Scade provides safe state machines to describe state-based behavior. But you're right, everything is finally mapped to an (at least locally) synchronous execution model.
However, when using UPPAAL or another model checker, you usually do the design in another framework and build a verification model to prove your properties. That is often non-trivial, since the design model has to be abstracted effectively (e.g. UPPAAL reaches its limits easily if the system contains many variables and clocks). Second, the abstraction-refinement relation has to be proven property-preserving for the relevant properties; otherwise, verifying something for the verification model may not tell you anything about the design model.
So my first question would be: do you already have some design models for your systems, or do you construct your own verification models only?
  • asked a question related to Model Checking
Question
7 answers
I would like to know what areas SMT is being applied to, and to connect with people working on them.
Relevant answer
Answer
My group has extensively used it for analyzing hybrid discrete-continuous systems, including a stochastic extension of SMT applicable to probabilistic variants thereof. The workhorses were SMT solvers over arithmetic theories, including ODEs. Details can be found on the web pages of the AVACS transregional research center (www.avacs.org), where you will also find loads of other applications of SMT. The latter include their use as subordinate solvers for discharging theory-related proof obligations in the yet richer framework SUP(T) of superposition modulo theory (look for authors Weidenbach et al.) and for redundancy detection and simplification in arithmetic decision diagrams (look for Scholl et al.).