Conference Paper

# Robustness-guided temporal logic testing and verification for Stochastic Cyber-Physical Systems


... Now, we define the QTS max distance, a measure of similarity of two given QTSs. The max distance between Q^(1) = (A^(1), a_0^(1), τ^(1), Σ, [·]^(1), L^(1)) and Q^(2) = (A^(2), a_0^(2), τ^(2), Σ, [·]^(2), L^(2)) is defined as: ...
... The temporal version of Problem 2 has been solved in [1,26]. We study the spatio-temporal case in this paper. Problem 2 can be formulated as an optimization problem: ...
Conference Paper
Full-text available
Networked dynamical systems are increasingly used as models for a variety of processes ranging from robotic teams to collections of genetically engineered living cells. As the complexity of these systems increases, so does the range of emergent properties that they exhibit. In this work, we define a new logic called Spatial-Temporal Logic (SpaTeL) that is a unification of signal temporal logic (STL) and tree spatial superposition logic (TSSL). SpaTeL is capable of describing high-level spatial patterns that change over time, e.g., "Power consumption in the northwest quadrant of the city drops below 100 megawatts if the power consumption in the southwest quadrant remains above 200 megawatts for two hours." We present a statistical model checking procedure that evaluates the probability with which a networked system satisfies a SpaTeL formula. We also develop a synthesis procedure that determines system parameters maximizing the average degree of satisfaction, a continuous measure that quantifies how strongly a system execution satisfies a given formula. We demonstrate our algorithms on two systems: a biochemical reaction-diffusion system and a demand-side management system for a smart neighborhood.
... We identify the policies for RVs through requirements engineering [20] and represent the policies with formal logic that enables formal reasoning about them. The policies are expressed with Metric Temporal Logic (MTL) [1], [42]. In contrast to Linear Temporal Logic (LTL) [50] and Computation Tree Logic (CTL) [26], which enable reasoning over occurrence and event ordering, MTL extends LTL's modalities with timing constraints, making it more amenable to representing semantically rich temporal and causal relations among the system states of RVs. ...
... Mapping Each Policy onto Terms (1). A policy is composed of the RV's physical states, configuration parameters, and environmental factors. ...
... The static analysis is used for identifying the terms related to each configuration parameter (Input P, 2 and 2a in Figure 3). We use two complementary approaches to identify the related terms: (1) conducting static analysis at the LLVM intermediate representation (IR) level, and (2) parsing vehicle manuals. ...
Conference Paper
Full-text available
... For example, a formula ∀v.∃v′. ◇_[0,2] v > 1 =⇒ ◇_[1,2] v′ > 2 is allowed, but a formula ∀v.∃v′. ◇_[0,2] (v > 1 =⇒ ◇_[1,2] v′ > 2) is not allowed. ...
... Also, t-HyperSTL restricts the until operator to be specified over an individual trace; e.g., t-HyperSTL does not allow the formula ∀v.∃v′. ...
... For the case that both execution traces of a system, w and w′, belong to some infinite sets, and if we have a verification oracle to address the last quantifier (e.g., by conservatively estimating the set of possible system behaviors, under certain conditions), we can either falsify or verify the system. Given a set of initial states, a verification oracle can be a method that mathematically overapproximates the reachable set of the system or a simulation-based technique [1,21] that may verify the system with finite simulations. ...
Conference Paper
Full-text available
A hyperproperty is a property that requires two or more execution traces to check. This is in contrast to properties expressed using temporal logics such as LTL, MTL and STL, which can be checked over individual traces. Hyperproperties are important as they are used to specify critical system performance objectives, such as those related to security, stochastic (or average) performance, and relationships between behaviors. We present the first study of hyperproperties of cyber-physical systems (CPSs). We introduce a new formalism for specifying a class of hyperproperties defined over real-valued signals, called HyperSTL. The proposed logic extends signal temporal logic (STL) by adding existential and universal trace quantifiers into STL's syntax to relate multiple execution traces. Several instances of hyperproperties of CPSs including stability, security, and safety are studied and expressed in terms of HyperSTL formulae. Furthermore, we propose a testing technique that allows us to check or falsify hyperproperties of CPS models. We present a discussion on the feasibility of falsifying or verifying various classes of hyperproperties for CPSs. We extend the quantitative semantics of STL to HyperSTL and show its utility in formulating algorithms for falsification of HyperSTL specifications. We demonstrate how we can specify and falsify HyperSTL properties for two case studies involving automotive control systems.
... A large positive value indicates that the formula φ is robustly satisfied by the trace τ at time t, a positive value close to zero suggests that τ (t) satisfies φ but it is close to violating φ, and a negative value indicates that the formula φ is violated by τ (t). This has motivated its use in learning STL formulae for specification mining [7,13,20,26], diagnosis [27], falsification [1,2,6], and system synthesis [4,11,38]. ...
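This robustness interpretation is easy to sketch for discrete-time traces. The helpers below are illustrative, not taken from any of the cited tools; they compute the robustness of a predicate f(x) > 0 and of bounded always/eventually over a sampled signal.

```python
# Discrete-time robustness sketch: rho > 0 means robust satisfaction,
# rho near 0 means marginal satisfaction, rho < 0 means violation.

def rho_pred(trace, t, f):
    """Robustness of the predicate f(x) > 0 at sample t."""
    return f(trace[t])

def rho_always(trace, t, interval, f):
    """Robustness of G_[a,b] (f(x) > 0): the worst margin in the window."""
    a, b = interval
    return min(f(trace[k]) for k in range(t + a, min(t + b + 1, len(trace))))

def rho_eventually(trace, t, interval, f):
    """Robustness of F_[a,b] (f(x) > 0): the best margin in the window."""
    a, b = interval
    return max(f(trace[k]) for k in range(t + a, min(t + b + 1, len(trace))))

# Hypothetical example: "speed stays below 120 over samples 0..4".
speed = [100, 105, 118, 110, 95, 130]
margin = rho_always(speed, 0, (0, 4), lambda x: 120 - x)  # 2: barely satisfied
```

With `margin == 2`, the trace satisfies the spec but sits close to violation, exactly the "positive value close to zero" case described above.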
... Existing techniques for learning STL formulas can be broadly classified into active and passive methods. Active STL learning methods rely on availability of a simulation model on which candidate temporal properties can be falsified [1,3,6,41]. This generates counterexamples. ...
... The presented approach is implemented in a publicly available tool, TeLEx. We evaluated the effectiveness of TeLEx on a number of synthetic and real case studies. All experiments were conducted on a quad-core Intel Core i5-2450M CPU @ 2.50 GHz with 3 MB cache per core and 4 GB RAM. ...
Article
Full-text available
We propose a novel passive learning approach, TeLEx, to infer signal temporal logic (STL) formulas that characterize the behavior of a dynamical system using only observed signal traces of the system. First, we present a template-driven learning approach that requires two inputs: a set of observed traces and a template STL formula. The unknown parameters in the template can include time-bounds of the temporal operators, as well as the thresholds in the inequality predicates. TeLEx finds the value of the unknown parameters such that the synthesized STL property is satisfied by all the provided traces and it is tight. This requirement of tightness is essential to generating interesting properties when only positive examples are provided and there is no option to actively query the dynamical system to discover the boundaries of legal behavior. We propose a novel quantitative semantics for satisfaction of STL properties which enables TeLEx to learn tight STL properties without multidimensional optimization. The proposed new metric is also smooth. This is critical to enable the use of gradient-based numerical optimization engines, and it produces a 30x to 100x speed-up with respect to the state-of-the-art gradient-free optimization. Second, we present a novel technique for automatically learning the structure of the STL formula by incrementally constructing more complex formulas guided by the robustness metric of subformulas. We demonstrate the effectiveness of the overall approach for learning STL formulas from only positive examples on a set of synthetic and real-world benchmarks.
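TeLEx's actual smooth semantics is not reproduced here, but the key idea it relies on, replacing the non-differentiable min/max of standard robustness with a smooth surrogate amenable to gradient-based optimizers, can be sketched with log-sum-exp. The function names and the β sharpness parameter are illustrative assumptions.

```python
import math

def softmax_lse(xs, beta=10.0):
    """Smooth, differentiable upper bound to max(xs); tightens as beta grows."""
    m = max(xs)  # shift for numerical stability before exponentiating
    return m + math.log(sum(math.exp(beta * (x - m)) for x in xs)) / beta

def softmin_lse(xs, beta=10.0):
    """Smooth lower bound to min(xs), obtained by duality."""
    return -softmax_lse([-x for x in xs], beta)

vals = [0.5, 1.0, 3.0]
smooth_max = softmax_lse(vals)  # close to 3.0, within log(len(vals)) / beta
smooth_min = softmin_lse(vals)  # close to 0.5
```

Because both surrogates are smooth in every input, a robustness score built from them admits the gradient-based optimization the abstract credits for the reported speed-up.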
... The OMA DiagMon Trap Events specification defines a number of standardized traps [14]. A geographic trap becomes active when a device enters a specific geographic area. ...
... The behavior of the QoS Management Agent is described by temporal logic [14]. We use a minimal set of standard notations G for always, U for until, and N for next. ...
Article
Full-text available
Machine-to-Machine communications (M2M) allow connected devices to exchange data and perform actions without human intervention. Different M2M applications have different quality of service (QoS) requirements. The dynamic QoS control is critical for applications like video surveillance, transportation services, and industrial control. The provided QoS depends on device connectivity. Connectivity management provides centralized management of M2M devices and connections. The paper studies model aspects of QoS management for connected M2M devices by means of connectivity management. Models for connectivity management are suggested, formally described and verified. The capabilities for device connectivity management are used to model the behavior of autonomous agent for dynamic QoS control.
... This formulation may become cumbersome for large delay constants, and by extending the syntax we can write this formula as □(p → ◇_[2,3] q), with ◇_[a,b] p being satisfied at t if p is satisfied at some t′ ∈ [t + a, t + b]. In discrete time, this can be viewed as syntactic sugar, but in dense time, where next is anyway meaningless, this construct allows events to occur anywhere in an interval, not necessarily at sampling points or clock ticks. ...
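In discrete time, the bounded-eventually construct discussed in the snippet above reduces to a window search over samples. A minimal Boolean sketch, assuming a uniformly sampled signal (the helper names are hypothetical):

```python
def eventually(p, t, a, b):
    """Boolean F_[a,b] p at sample t: true iff p holds at some
    t' in [t + a, t + b] (clipped to the end of the signal)."""
    return any(p[k] for k in range(t + a, min(t + b + 1, len(p))))

def always_implies_eventually(p, q, a, b):
    """G (p -> F_[a,b] q) over a whole pair of sampled signals."""
    return all((not p[t]) or eventually(q, t, a, b) for t in range(len(p)))

p = [True, False, False, True, False, False]
q = [False, False, True, False, False, True]
ok = always_implies_eventually(p, q, 2, 3)  # every p is answered by a q
```

In dense time no such sample grid exists, which is exactly why the construct above is more than syntactic sugar there.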
... Therein, they show the presence of unexpected behaviours in an automatic transmission model that were not revealed by previous testing approaches. Falsification methods for stochastic systems were applied to stochastic models of automotive systems in [3]. The presented framework for robustness-guided falsification in this section is also used as an intermediate step in specification mining methods [129,74]. ...
Chapter
The term Cyber-Physical Systems (CPS) typically refers to engineered, physical and biological systems monitored and/or controlled by an embedded computational core. The behaviour of a CPS over time is generally characterised by the evolution of physical quantities, and discrete software and hardware states. In general, these can be mathematically modelled by the evolution of continuous state variables for the physical components interleaved with discrete events. Despite large effort and progress in the exhaustive verification of such hybrid systems, the complexity of CPS models limits formal verification of safety of their behaviour only to small instances. An alternative approach, closer to the practice of simulation and testing, is to monitor and to predict CPS behaviours at simulation-time or at runtime. In this chapter, we summarise the state-of-the-art techniques for qualitative and quantitative monitoring of CPS behaviours. We present an overview of some of the important applications and, finally, we describe the tools supporting CPS monitoring and compare their main features.
... design of continuous-time and discrete-time linear stochastic systems [7], [8]. For MTL and its variants, a specification-guided testing framework is proposed in [9] for verification of stochastic cyber-physical systems. Reference [10] proposes a solution to the vehicle routing problem with respect to MTL specifications. ...
Article
Full-text available
This paper studies an optimal control problem for continuous-time stochastic systems subject to reachability objectives specified in a subclass of metric interval temporal logic specifications, a temporal logic with real-time constraints. We propose a probabilistic method for synthesizing an optimal control policy that maximizes the probability of satisfying a specification based on a discrete approximation of the underlying stochastic system. First, we show that the original problem can be formulated as a stochastic optimal control problem in a state space augmented with finite memory and states of some clock variables. Second, we present a numerical method for computing an optimal policy with which the given specification is satisfied with the maximal probability in point-based semantics in the discrete approximation of the underlying system. We show that the policy obtained in the discrete approximation converges to the optimal one for satisfying the specification in the continuous or dense-time semantics as the discretization becomes finer in both state and time. Finally, we illustrate our approach with a robotic motion planning example.
... Therefore, the goal of the optimizer is to eventually tweak the VM1 and VM2 signals in a way that satisfies all the given constraints described in the query (e.g., IQ1-IQ3). Regarding the utilized stochastic optimizer, we employ the Expected Robustness Guided Monte Carlo (ERGMC) algorithm, which is based on simulated annealing and is presented in [32]. We set the number of control points equal to the number of convolution layers of each target DNN, and evenly distribute them. ...
Preprint
Full-text available
Deep Neural Networks (DNNs) are being heavily utilized in modern applications and are putting energy-constrained devices to the test. To bypass high energy consumption issues, approximate computing has been employed in DNN accelerators to balance out the accuracy-energy reduction trade-off. However, the approximation-induced accuracy loss can be very high and drastically degrade the performance of the DNN. Therefore, there is a need for a fine-grain mechanism that would assign specific DNN operations to approximation in order to maintain acceptable DNN accuracy, while also achieving low energy consumption. In this paper, we present an automated framework for weight-to-approximation mapping enabling formal property exploration for approximate DNN accelerators. At the MAC unit level, our experimental evaluation surpassed already energy-efficient mappings by more than 2× in terms of energy gains, while also supporting significantly more fine-grain control over the introduced approximation.
... Several works do not present the complexity of CPSs in a clear and understandable way. To address this challenge, a testing framework was presented in [3] with the goal of detecting system operating conditions that cause the system to exhibit the worst expected specification robustness. ...
Conference Paper
Full-text available
... Further details on the necessity and implications of the aforementioned assumptions can be found in [7]. Assumption 3 can also be relaxed as shown in [23]. ...
Article
Full-text available
One of the advantages of adopting a model-based development process is that it enables testing and verification at early stages of development. However, it is often desirable to not only verify/falsify certain formal system specifications, but also to automatically explore the properties that the system satisfies. In this work, we present a framework that enables property exploration for cyber-physical systems. Namely, given a parametric specification with multiple parameters, our solution can automatically infer the ranges of parameters for which the property does not hold on the system. In this paper, we consider parametric specifications in metric or Signal Temporal Logic (MTL or STL). Using robust semantics for MTL, the parameter mining problem can be converted into a Pareto optimization problem for which we can provide an approximate solution by utilizing stochastic optimization methods. We include algorithms for the exploration and visualization of multi-parametric specifications. The framework is demonstrated on an industrial size, high-fidelity engine model as well as examples from related literature.
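For a single parameter in which robustness is monotone, the mining problem described in this abstract degenerates to a sign-change search on the robustness function. The sketch below assumes continuity and monotonicity and is only an illustration, not the cited framework's stochastic-optimization algorithm.

```python
def mine_threshold(rho, lo, hi, tol=1e-6):
    """Bisect for the parameter value where rho changes sign, assuming rho
    is continuous and monotone on [lo, hi] with opposite signs at the ends."""
    r_lo_positive = rho(lo) > 0
    for _ in range(200):
        if hi - lo < tol:
            break
        mid = (lo + hi) / 2.0
        if (rho(mid) > 0) == r_lo_positive:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical parametric spec "speed always below c" on a trace peaking at 118:
peak = 118.0
c_star = mine_threshold(lambda c: c - peak, 100.0, 140.0)  # converges to ~118
```

Multi-parameter specifications lose this monotone structure, which is why the abstract casts the general problem as Pareto optimization instead.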
... Of course, CPS verification techniques have been widely investigated within the online verification setting (Section 1.1). For example, [17,42] present online approaches using deterministic strategies to select the next disturbance to the SUV, whereas [11,41,1,42,16,33,2,23,18] present online approaches using probabilistic strategies (Monte Carlo simulation) to select the next disturbance to the SUV. ...
Article
Cyber Physical Systems (CPSs) consist of hardware and software components. To verify that the whole (i.e., software + hardware) system meets the given specifications, exhaustive simulation-based approaches (Hardware In the Loop Simulation, HILS) can be effectively used by first generating all relevant simulation scenarios (i.e., sequences of disturbances) and then actually simulating all of them (verification phase). When considering the whole verification activity, we see that the above mentioned verification phase is repeated until no error is found. Accordingly, in order to minimise the time taken by the whole verification activity, in each verification phase we should, ideally, start by simulating scenarios witnessing errors (counterexamples). Of course, to know beforehand the set of such scenarios is not feasible. In this paper we show how to select scenarios so as to minimise the Worst Case Expected Verification Time.
... One natural choice to attain correct functioning is to consider formal methods techniques, such as model checking [1], [2], which have been successfully used in the formal verification and synthesis of digital circuits and software codes [3]. In recent years, we have seen many efforts of extending formal methods to engineering applications, e.g., automobiles [4], [5] and robotics [6], [7]. One crucial component of formal methods is a precise and potentially concise mathematical model of the system under investigation. ...
Article
Full-text available
How to effectively and reliably guarantee the correct functioning of safety-critical cyber-physical systems in uncertain conditions is a challenging problem. This paper presents a data-driven algorithm to derive approximate abstractions for piecewise affine systems with unknown dynamics. It advocates a significant shift from the current paradigm of abstraction, which starts from a model with known dynamics. Given a black-box system with unknown dynamics and a linear temporal logic specification, the proposed algorithm is able to obtain an abstraction of the system with an arbitrarily small error and an arbitrarily large probability. The algorithm consists of three components, system identification, system abstraction, and active sampling. The effectiveness of the algorithm is demonstrated by a case study with a soft robot.
... This problem, the dual of verification, attempts not to prove that the system M is correct under all inputs u ∈ U, but simply to find a faulty execution w = M(u), without any formal guarantee that it will be found. The most effective technique, as illustrated in [2,34], turns the falsification problem into the following optimization problem: "minimize ρ(φ, w) subject to w = M(u), u ∈ U." The robustness value ρ is expected to be continuous in u, and by definition w ⊭ φ when ρ(φ, w) < 0. This technique was indeed used in [25,27,29,42] precisely for the purpose of temporal logic parameter exploration. ...
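The "minimize ρ(φ, w) subject to w = M(u)" formulation above can be instantiated with even the simplest optimizer. The sketch below uses plain random search; the model, spec, and input sampler are hypothetical toys, not any cited benchmark.

```python
import random

def falsify(model, rho, sample_input, budget=1000, seed=0):
    """Random-search falsification: look for u with rho(model(u)) < 0.
    Returns the least-robust input found and its robustness value."""
    rng = random.Random(seed)
    best_u, best_rho = None, float("inf")
    for _ in range(budget):
        u = sample_input(rng)
        r = rho(model(u))
        if r < best_rho:
            best_u, best_rho = u, r
        if best_rho < 0:      # counterexample found, stop early
            break
    return best_u, best_rho

# Toy system: the output ramps up to the input value u.
model = lambda u: [u * k / 10.0 for k in range(11)]
rho = lambda w: 1.0 - max(w)            # robustness of "always w < 1"
sample = lambda rng: rng.uniform(0.0, 2.0)
u_bad, r = falsify(model, rho, sample)  # any u > 1 falsifies the spec
```

A negative return value certifies a violation; a positive one proves nothing, which is the "no formal guarantees" caveat in the snippet.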
Conference Paper
Full-text available
We describe a new algorithm for the parametric identification problem for signal temporal logic (STL), stated as follows. Given a dense-time real-valued signal w and a parameterized temporal logic formula φ, compute the subset of the parameter space that renders the formula satisfied by the signal. Unlike previous solutions, which were based on search in the parameter space or quantifier elimination, our procedure works recursively on φ and computes the evolution over time of the set of valid parameter assignments. This procedure is similar to that of monitoring or computing the robustness of φ relative to w. Our implementation and experiments demonstrate that this approach can work well in practice.
... The method can be applied to a Simulink/Stateflow™ (S/S) model and is implemented in the Matlab toolbox S-TaLiRo. Abbas et al. [83] also proposed a robustness-guided temporal logic testing (RGTLT) method based on the S-TaLiRo toolbox. This method uses MTL robustness theory to quantify the robustness of a stochastic CPS. ...
Article
Full-text available
Cyber Physical Systems (CPSs) are rapidly developing, with increasing scale, complexity, and heterogeneity. However, testing CPSs systematically to ensure that they operate with high reliability remains a big challenge. Therefore, it is necessary to summarize existing works and technologies systematically, with the aim of inspiring new inventions for more efficient CPS testing. Accordingly, this study first investigated the advances in CPS testing methods from ten aspects, including different testing paradigms, technologies, and some non-functional testing methods (including security testing, robust testing, and fragility testing). Then, we further elaborate on the infrastructures of CPS testbeds from the perspectives of their architecture and the corresponding function analyses. Finally, challenges and future research directions are identified and discussed. It can be concluded that future CPS testing should focus more on combining different paradigms and technologies for multi-objective testing by integrating more emerging cutting-edge technologies such as the Internet of Things, big data, cloud computing, and AI.
... In hyperproperties, we can further make a distinction between statistical hyperproperties, i.e., properties that reason about statistical aspects of the system (such as average energy consumption, mean time to failure, etc.), and relational hyperproperties. There has been limited work on estimating statistical properties of CPS models [2], but not much work has been done to verify or falsify statistical hyperproperties. Relational hyperproperties are gaining popularity for expressing security and privacy properties such as information leakage, robust I/O behavior, non-interference, non-inference, etc. [112,47]. ...
Chapter
Full-text available
Modern cyber-physical systems (CPS) are often developed in a model-based development (MBD) paradigm. The MBD paradigm involves the construction of different kinds of models: (1) a plant model that encapsulates the physical components of the system (e.g., mechanical, electrical, chemical components) using representations based on differential and algebraic equations, (2) a controller model that encapsulates the embedded software components of the system, and (3) an environment model that encapsulates physical assumptions on the external environment of the CPS application. In order to reason about the correctness of CPS applications, we typically pose the following question: For all possible environment scenarios, does the closed-loop system consisting of the plant and the controller exhibit the desired behavior? Typically, the desired behavior is expressed in terms of properties that specify unsafe behaviors of the closed-loop system. Often, such behaviors are expressed using variants of real-time temporal logics. In this chapter, we will examine formal methods based on bounded-time reachability analysis, simulation-guided reachability analysis, deductive techniques based on safety invariants, and formal, requirement-driven testing techniques. We will review key results in the literature, and discuss the scalability and applicability of such systems to various academic and industrial contexts. We conclude this chapter by discussing the challenge to formal verification and testing techniques posed by newer CPS applications that use AI-based software components.
... Remark 1: In this paper, the function sim is assumed to be deterministic; however, the results we present are also applicable to stochastic systems (i.e., when the sim function is stochastic). See [47] for a discussion. ...
Preprint
Autonomous vehicles are complex systems that are challenging to test and debug. A requirements-driven approach to the development process can decrease the resources required to design and test these systems, while simultaneously increasing the reliability. We present a testing framework that uses signal temporal logic (STL), which is a precise and unambiguous requirements language. Our framework evaluates test cases against the STL formulae and additionally uses the requirements to automatically identify test cases that fail to satisfy the requirements. One of the key features of our tool is the support for machine learning (ML) components in the system design, such as deep neural networks. The framework allows evaluation of the control algorithms, including the ML components, and it also includes models of CCD camera, lidar, and radar sensors, as well as the vehicle environment. We use multiple methods to generate test cases, including covering arrays, which is an efficient method to search discrete variable spaces. The resulting test cases can be used to debug the controller design by identifying controller behaviors that do not satisfy requirements. The test cases can also enhance the testing phase of development by identifying critical corner cases that correspond to the limits of the system's allowed behaviors. We present STL requirements for an autonomous vehicle system, which capture both component-level and system-level behaviors. Additionally, we present three driving scenarios and demonstrate how our requirements-driven testing framework can be used to identify critical system behaviors, which can be used to support the development process.
... In addition to the reported lessons learned, future work could focus on (i) online signal generation for identifying challenging and arbitrarily shaped input signals for the software under test [40], (ii) searching in the scenario space for automatically constructing meaningful scenarios [41], and (iii) finding arguments on when to stop testing in case no falsifying runs are found [42], [43]. Despite promising initial works, there are still many open questions, particularly in the industrial setting. ...
... Methods based on Statistical Model Checking (SMC) [19], [20], [41] can overcome hurdles like scalability and nonlinearity and provide probabilistic guarantees [40], [30], [38], [4], [1]. These methods are based on statistical inference methods like sequential probability ratio tests [19], [30], [32], [4], Bayesian statistics [41], and Clopper-Pearson bounds [40]. ...
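The SMC recipe in the snippet above, estimating a satisfaction probability from i.i.d. simulations and attaching a finite-sample bound, can be sketched as follows. To stay dependency-free this uses a Hoeffding-style confidence radius rather than an exact Clopper-Pearson interval; the system and spec are hypothetical.

```python
import math
import random

def smc_estimate(sat, sample_trace, n=2000, delta=0.05, seed=1):
    """Estimate P(spec holds) from n i.i.d. simulations, with a two-sided
    Hoeffding confidence radius eps valid at level 1 - delta."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if sat(sample_trace(rng)))
    p_hat = hits / n
    eps = math.sqrt(math.log(2.0 / delta) / (2.0 * n))
    return p_hat, eps

# Toy stochastic "system": one noisy output sample; spec: output < 1.5.
sat = lambda w: w < 1.5
sample_trace = lambda rng: rng.gauss(1.0, 0.5)
p_hat, eps = smc_estimate(sat, sample_trace)  # true probability ~0.84
```

Clopper-Pearson bounds give a tighter exact interval from the same `hits`/`n` counts, at the cost of needing the beta distribution's inverse CDF.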
Preprint
The use of machine learning components has posed significant challenges for the verification of cyber-physical systems due to their complexity, nonlinearity, and large parameter spaces. In this work, we propose a novel probabilistic verification framework for learning-enabled CPS which can search over the entire (infinite) space of parameters to identify the ones that lead to satisfaction or violation of specifications captured by Signal Temporal Logic (STL) formulas. Our technique is based on conformal regression, a method for constructing prediction intervals with marginal coverage guarantees using finite samples, without making assumptions on the distribution and regression model. Our verification framework, using conformal regression, can predict the quantitative satisfaction values of the system's trajectories over different sets of the parameters and use those values to quantify how well/badly the system with those parameters can satisfy/violate the given STL property. We use three case studies of learning-enabled CPS applications to demonstrate that our technique can be successfully applied to partition the parameter space and provide the needed level of assurance.
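The conformal-regression ingredient of this framework, distribution-free prediction intervals with finite-sample marginal coverage, can be sketched with split conformal prediction. The calibration data below are synthetic and purely illustrative.

```python
import math
import random

def split_conformal_radius(cal_residuals, alpha=0.1):
    """Conformal quantile q: intervals [y_hat - q, y_hat + q] built with it
    have >= 1 - alpha marginal coverage on exchangeable data."""
    n = len(cal_residuals)
    k = math.ceil((n + 1) * (1.0 - alpha))   # finite-sample correction
    return sorted(cal_residuals)[min(k, n) - 1]

rng = random.Random(0)
# Calibration residuals |y - y_hat| from a held-out split (synthetic here).
residuals = [abs(rng.gauss(0.0, 1.0)) for _ in range(999)]
q = split_conformal_radius(residuals, alpha=0.1)  # ~90th pct of |N(0,1)|
```

The coverage guarantee holds for any regression model and any data distribution, which is what lets the framework bound satisfaction values without distributional assumptions.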
... In fact, SBTG methods have already established that they can falsify benchmark problems at a fraction of the cost of Monte-Carlo sampling, or even when Monte-Carlo sampling fails, e.g., [4,6]. However, current SBTG methods (with the possible exception of [22,23]) cannot answer an important open question: what conclusions are to be drawn if no falsifying behavior has been discovered? In other words, when the test/simulation budget is exhausted and no violation has been discovered, can we conclude that the SUT is safe, or at least that it is likely safe? ...
Preprint
Full-text available
Requirements driven search-based testing (also known as falsification) has proven to be a practical and effective method for discovering erroneous behaviors in Cyber-Physical Systems. Despite the constant improvements on the performance and applicability of falsification methods, they all share a common characteristic. Namely, they are best-effort methods which do not provide any guarantees on the absence of erroneous behaviors (falsifiers) when the testing budget is exhausted. The absence of finite time guarantees is a major limitation which prevents falsification methods from being utilized in certification procedures. In this paper, we address the finite-time guarantees problem by developing a new stochastic algorithm. Our proposed algorithm not only estimates (bounds) the probability that falsifying behaviors exist, but also it identifies the regions where these falsifying behaviors may occur. We demonstrate the applicability of our approach on standard benchmark functions from the optimization literature and on the F16 benchmark problem.
Chapter
Logical specifications have enabled formal methods by carefully describing what is correct, desired or expected of a given system. They have been widely used in runtime monitoring and applied to domains ranging from medical devices to information security. In this tutorial, we will present the theory and application of robustness of logical specifications. Rather than evaluate logical formulas to Boolean valuations, robustness interpretations attempt to provide numerical valuations that provide degrees of satisfaction, in addition to true/false valuations to models. Such a valuation can help us distinguish between behaviors that “barely” satisfy a specification to those that satisfy it in a robust manner. We will present and compare various notions of robustness in this tutorial, centered primarily around applications to safety-critical Cyber-Physical Systems (CPS). We will also present key ways in which the robustness notions can be applied to problems such as runtime monitoring, falsification search for finding counterexamples, and mining design parameters for synthesis.
Conference Paper
We propose a novel passive learning approach, TeLEx, to infer signal temporal logic formulas that characterize the behavior of a dynamical system using only observed signal traces of the system. The approach requires two inputs: a set of observed traces and a template Signal Temporal Logic (STL) formula. The unknown parameters in the template can include time-bounds of the temporal operators, as well as the thresholds in the inequality predicates. TeLEx finds the value of the unknown parameters such that the synthesized STL property is satisfied by all the provided traces and it is tight. This requirement of tightness is essential to generating interesting properties when only positive examples are provided and there is no option to actively query the dynamical system to discover the boundaries of legal behavior. We propose a novel quantitative semantics for satisfaction of STL properties which enables TeLEx to learn tight STL properties without multidimensional optimization. The proposed new metric is also smooth. This is critical to enable the use of gradient-based numerical optimization engines, and it produces a 30×–100× speed-up with respect to the state-of-the-art gradient-free optimization. The approach is implemented in a publicly available tool.
Article
Full-text available
Autonomous vehicles are complex systems that are challenging to test and debug. A requirements-driven approach to the development process can decrease the resources required to design and test these systems, while simultaneously increasing the reliability. We present a testing framework that uses signal temporal logic (STL), which is a precise and unambiguous requirements language. Our framework evaluates test cases against the STL formulae and additionally uses the requirements to automatically identify cases that fail to satisfy the requirements. One of the key features of our tool is the support for machine learning (ML) components in the system design, such as deep neural networks. Our framework includes evaluation of the control algorithms, including the ML components, and it also includes models of CCD camera, lidar, and radar sensors, as well as the vehicle environment. We use multiple methods to generate test cases, including covering arrays, which is an efficient method to search discrete variable spaces.The resulting test cases can be used to debug the controller design by identifying controller behaviors that do not satisfy requirements. The test cases can also enhance the testing phase of development by identifying critical corner cases that correspond to the limits of the system's allowed behaviors.
Article
We present a counterexample-guided inductive synthesis approach to controller synthesis for cyber-physical systems subject to signal temporal logic (STL) specifications, operating in potentially adversarial nondeterministic environments. We encode STL specifications as mixed integer-linear constraints on the variables of a discrete-time model of the system and environment dynamics, and solve a series of optimization problems to yield a satisfying control sequence. We demonstrate how the scheme can be used in a receding horizon fashion to fulfill properties over unbounded horizons, and present experimental results for reactive controller synthesis for case studies in building climate control and autonomous driving.
Chapter
In this paper, we conduct a preliminary comparative study of classification of longitudinal driving behavior using Signal Temporal Logic (STL) formulas. The goal of the classification problem is to distinguish between different driving styles or vehicles. The results can be used to design and test autonomous vehicle policies. We work on a real-life dataset, the Highway Drone Dataset (HighD). To solve this problem, our first approach starts with a formula template and reduces the classification problem to a Mixed-Integer Linear Program (MILP). Solving MILPs becomes computationally challenging with an increasing number of variables and constraints. We propose two improvements that split the classification problem into smaller ones. We prove that these simpler problems are related to the original classification problem in such a way that their feasibility implies that of the original. Finally, we compare our MILP formulation with an existing STL-based classification tool, LoTuS, in terms of accuracy and execution time. Keywords: driving behavior, STL classification, formal methods
Article
Full-text available
We present a Monte-Carlo optimization technique for finding system behaviors that falsify a metric temporal logic (MTL) property. Our approach performs a random walk over the space of system inputs guided by a robustness metric defined by the MTL property. The robustness metric guides the search for a falsifying behavior by steering exploration toward trajectories with smaller robustness values. The resulting testing framework can be applied to a wide class of cyber-physical systems (CPS). We show through experiments on complex system models that using our framework can help automatically falsify properties with more consistency as compared to other means, such as uniform sampling.
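The robustness-guided search idea can be sketched as a stochastic walk over a scalar input that descends in robustness until it goes negative; the "simulator" and property here are toy stand-ins, not the systems studied in the paper:

```python
import random

def robustness(u):
    """Toy monitor: robustness of 'always |y(t)| < 2' for the response
    y(t) = u * t * (1 - t/10) under a constant input amplitude u.
    A negative value means the property is falsified."""
    y = [u * t * (1 - t / 10.0) for t in range(11)]
    return min(2.0 - abs(v) for v in y)

def falsify(iters=500, step=0.3, seed=0):
    """Random walk over the input space, accepting moves that lower
    robustness, until a falsifying input (rho < 0) is found."""
    rng = random.Random(seed)
    u, rho = 0.0, robustness(0.0)
    best_u, best_rho = u, rho
    for _ in range(iters):
        cand = u + rng.gauss(0.0, step)
        r = robustness(cand)
        if r < rho:                  # walk downhill in robustness
            u, rho = cand, r
        if rho < best_rho:
            best_u, best_rho = u, rho
        if best_rho < 0:
            break                    # falsifying behavior found
    return best_u, best_rho

u_star, rho_star = falsify()
```

Any black-box simulator plus an MTL robustness monitor can be slotted in for `robustness`; the search itself never inspects the system's internals.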
Conference Paper
Full-text available
S-TaLiRo is a software toolbox that performs stochastic search for system trajectories that falsify real-time temporal logic specifications. S-TaLiRo is founded on the notion of robustness of temporal logic specifications. In this paper, we present a dynamic programming algorithm for computing the robustness of temporal logic specifications with respect to system trajectories. We also demonstrate that typical automotive functional requirements can be captured and falsified using temporal logics and S-TaLiRo.
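The backward dynamic-programming structure of such a robustness monitor can be sketched for the unbounded "always" and "eventually" operators over a sampled trace; the predicates and trace are illustrative, not the tool's actual code:

```python
def pred_rob(trace, f):
    """Pointwise robustness of an atomic predicate f(x) >= 0."""
    return [f(x) for x in trace]

def always(rob):
    """Backward DP for G(phi): out[i] = min over j >= i of rob[j]."""
    out = rob[:]
    for i in range(len(rob) - 2, -1, -1):
        out[i] = min(out[i], out[i + 1])
    return out

def eventually(rob):
    """Backward DP for F(phi): out[i] = max over j >= i of rob[j]."""
    out = rob[:]
    for i in range(len(rob) - 2, -1, -1):
        out[i] = max(out[i], out[i + 1])
    return out

trace = [0.5, 1.5, 0.2, 3.0]
rho_G = always(pred_rob(trace, lambda x: x - 0.0))[0]      # rob. of G(x > 0)
rho_F = eventually(pred_rob(trace, lambda x: 2.0 - x))[0]  # rob. of F(x < 2)
```

Each temporal operator is one linear backward pass, which is why the dynamic-programming formulation scales to long trajectories.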
Article
Full-text available
Stochastic models such as Continuous-Time Markov Chains (CTMC) and Stochastic Hybrid Automata (SHA) are powerful formalisms to model and to reason about the dynamics of biological systems, due to their ability to capture the stochasticity inherent in biological processes. A classical question in formal modelling with clear relevance to biological modelling is the model checking problem, i.e., calculating the probability that a behaviour, expressed for instance in terms of a certain temporal logic formula, may occur in a given stochastic process. However, one may not only be interested in the notion of satisfiability, but also in the capacity of a system to maintain a particular emergent behaviour unaffected by perturbations, caused e.g. by extrinsic noise or by possible small changes in the model parameters. To address this issue, researchers from the verification community have recently proposed several notions of robustness for temporal logic, providing suitable definitions of distance between a trajectory of a (deterministic) dynamical system and the boundaries of the set of trajectories satisfying the property of interest. The contributions of this paper are twofold. First, we extend the notion of robustness to stochastic systems, showing that this naturally leads to a distribution of robustness scores. By discussing two examples, we show how to approximate the distribution of the robustness score and its key indicators: the average robustness and the conditional average robustness. Secondly, we show how to combine these indicators with the satisfaction probability to address the system design problem, where the goal is to optimize some control parameters of a stochastic model in order to maximize robustness of the desired specifications.
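Estimating the robustness distribution and its indicators amounts to sampling trajectories and computing a robustness score per sample; the stochastic model below (a drifting random walk) and the property are toy assumptions for illustration:

```python
import random

def simulate(rng):
    """Toy stochastic trajectory: a random walk with upward drift."""
    x, tr = 0.0, []
    for _ in range(20):
        x += 0.05 + rng.gauss(0.0, 0.1)
        tr.append(x)
    return tr

def robustness(tr):
    """Robustness of 'always x < 2' on one sampled trajectory."""
    return min(2.0 - x for x in tr)

rng = random.Random(1)
rhos = [robustness(simulate(rng)) for _ in range(5000)]
sat = [r for r in rhos if r > 0]

p_sat = len(sat) / len(rhos)                    # satisfaction probability
avg_rho = sum(rhos) / len(rhos)                 # average robustness
cond_avg = sum(sat) / len(sat) if sat else 0.0  # conditional average robustness
```

The conditional average, taken only over satisfying runs, is never below the plain average, which is why the paper treats the two as separate design indicators.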
Article
Full-text available
Many applications require optimizing an unknown, noisy function that is expensive to evaluate. We formalize this task as a multiarmed bandit problem, where the payoff function is either sampled from a Gaussian process (GP) or has low norm in a reproducing kernel Hilbert space. We resolve the important open problem of deriving regret bounds for this setting, which imply novel convergence rates for GP optimization. We analyze an intuitive Gaussian process upper confidence bound (GP-UCB) algorithm, and bound its cumulative regret in terms of maximal information gain, establishing a novel connection between GP optimization and experimental design. Moreover, by bounding the latter in terms of operator spectra, we obtain explicit sublinear regret bounds for many commonly used covariance functions. In some important cases, our bounds have surprisingly weak dependence on the dimensionality. In our experiments on real sensor data, GP-UCB compares favorably with other heuristical GP optimization approaches.
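GP-UCB itself scores candidate points by posterior mean plus a scaled posterior standard deviation; as a minimal pure-Python illustration of the same optimism-under-uncertainty rule, here is UCB1 on discrete arms (a simplifying assumption: empirical means and counts stand in for the GP posterior):

```python
import math
import random

def ucb1(pull, n_arms, rounds, seed=0):
    """Play the arm maximizing mean + sqrt(2 ln t / n): the same
    upper-confidence-bound principle GP-UCB applies with a GP
    posterior mean and variance over a continuous domain."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(1, rounds + 1):
        if t <= n_arms:
            a = t - 1  # play each arm once to initialize
        else:
            a = max(range(n_arms),
                    key=lambda i: sums[i] / counts[i]
                    + math.sqrt(2.0 * math.log(t) / counts[i]))
        counts[a] += 1
        sums[a] += pull(a, rng)
    return counts

# noisy arms with true means 0.2, 0.5, 0.8 (illustrative values)
means = [0.2, 0.5, 0.8]
counts = ucb1(lambda a, rng: means[a] + rng.gauss(0.0, 0.1), 3, 2000)
```

Over time the exploration bonus shrinks on well-sampled arms, so pulls concentrate on the best arm while suboptimal arms are still sampled often enough to bound regret.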
Article
Full-text available
This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique applied for implementing this semantics in the UPPAAL-SMC simulation engine. We report on two applications of the resulting tool-set coming from systems biology and energy aware buildings.
Article
Full-text available
We introduce bounds on the finite-time performance of Markov chain Monte Carlo (MCMC) algorithms in solving global stochastic optimization problems defined over continuous domains. It is shown that MCMC algorithms with finite-time guarantees can be developed with a proper choice of the target distribution and by studying their convergence in total variation norm. This work is inspired by the concept of finite-time learning with known accuracy and confidence developed in statistical learning theory.
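The core construction — choosing a target distribution proportional to exp(-f(x)/T) so that low temperature concentrates mass near the global minimum — can be sketched with a Metropolis chain on a toy double-well objective (the objective and parameters are illustrative assumptions):

```python
import math
import random

def f(x):
    """Double-well objective: local minimum near x = 1, global near x = -1."""
    return (x * x - 1.0) ** 2 + 0.2 * x

def metropolis_min(f, iters=20000, temp=0.2, step=0.5, seed=0):
    """Metropolis sampling from the target density proportional to
    exp(-f(x)/temp); tracking the best sample gives a stochastic
    global optimizer whose accuracy/confidence improve with iters."""
    rng = random.Random(seed)
    x = 3.0                       # deliberately bad starting point
    best_x, best_f = x, f(x)
    for _ in range(iters):
        cand = x + rng.gauss(0.0, step)
        # accept with probability min(1, exp((f(x) - f(cand)) / temp))
        if rng.random() < math.exp(min(0.0, (f(x) - f(cand)) / temp)):
            x = cand
            if f(x) < best_f:
                best_x, best_f = x, f(x)
    return best_x, best_f

best_x, best_f = metropolis_min(f)
# global minimum is near x ≈ -1.02 with f ≈ -0.20
```

Unlike pure descent, the chain occasionally accepts uphill moves, which is what lets it cross the barrier between the two wells.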
Conference Paper
Full-text available
The systematic exploration of the space of all the behaviours of a software system forms the basis of numerous approaches to verification. However, existing approaches face many challenges with scalability and precision. We propose a framework for validating programs based on statistical sampling of inputs guided by statically generated constraints that steer the simulations towards more "desirable" traces. Our approach works iteratively: each iteration first simulates the system on some inputs sampled from a restricted space, while recording facts about the simulated traces. Subsequent iterations of the process attempt to steer the future simulations away from what has already been seen in the past iterations. This is achieved by two separate means: (a) we perform symbolic executions in order to guide the choice of inputs, and (b) we sample from the input space using a probability distribution specified by means of previously observed test data using a Markov Chain Monte-Carlo (MCMC) technique. As a result, the sampled inputs generate traces that are likely to be significantly different from the observations in the previous iterations in some user-specified ways. We demonstrate that our approach is effective: it can rapidly isolate rare behaviours of systems that reveal more bugs.
Conference Paper
Full-text available
We address the problem of model checking stochastic systems, i.e.~checking whether a stochastic system satisfies a certain temporal property with a probability greater (or smaller) than a fixed threshold. In particular, we present a novel Statistical Model Checking (SMC) approach based on Bayesian statistics. We show that our approach is feasible for hybrid systems with stochastic transitions, a generalization of Simulink/Stateflow models. Standard approaches to stochastic (discrete) systems require numerical solutions for large optimization problems and quickly become infeasible with larger state spaces. Generalizations of these techniques to hybrid systems with stochastic effects are even more challenging. The SMC approach was pioneered by Younes and Simmons in the discrete and non-Bayesian case. It solves the verification problem by combining randomized sampling of system traces (which is very efficient for Simulink/Stateflow) with hypothesis testing or estimation. We believe SMC is essential for scaling up to large Stateflow/Simulink models. While the answer to the verification problem is not guaranteed to be correct, we prove that Bayesian SMC can make the probability of giving a wrong answer arbitrarily small. The advantage is that answers can usually be obtained much faster than with standard, exhaustive model checking techniques. We apply our Bayesian SMC approach to a representative example of stochastic discrete-time hybrid system models in Stateflow/Simulink: a fuel control system featuring hybrid behavior and fault tolerance. We show that our technique enables faster verification than state-of-the-art statistical techniques, while retaining the same error bounds. We emphasize that Bayesian SMC is by no means restricted to Stateflow/Simulink models: we have in fact successfully applied it to very large stochastic models from Systems Biology.
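A minimal Bayesian SMC loop keeps a Beta posterior over the satisfaction probability and stops once one hypothesis is sufficiently probable; the sampler, prior, and thresholds below are illustrative assumptions, not the paper's exact test statistic:

```python
import random
from math import comb

def beta_cdf(x, a, b):
    """CDF of Beta(a, b) at x for integer shape parameters, via the
    binomial identity for the regularized incomplete beta function."""
    n = a + b - 1
    return sum(comb(n, j) * x**j * (1 - x)**(n - j) for j in range(a, n + 1))

def bayes_smc(sample_sat, theta, threshold=0.99, max_samples=100000, seed=0):
    """Sketch of Bayesian SMC: Beta(1,1) prior on the satisfaction
    probability p; sample traces until the posterior mass of
    H1: p > theta (or of its complement) exceeds `threshold`, so the
    error probability can be driven arbitrarily small."""
    rng = random.Random(seed)
    a, b = 1, 1
    for n in range(1, max_samples + 1):
        if sample_sat(rng):
            a += 1
        else:
            b += 1
        p_h1 = 1.0 - beta_cdf(theta, a, b)   # posterior P(p > theta | data)
        if p_h1 > threshold:
            return True, n
        if p_h1 < 1.0 - threshold:
            return False, n
    return None, max_samples

# a trace sampler whose true satisfaction probability is 0.97 (toy model)
accept, n_used = bayes_smc(lambda rng: rng.random() < 0.97, theta=0.9)
```

Because the test is sequential, easy instances terminate after a handful of simulations, which is where the speed-up over fixed-sample statistical tests comes from.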
Conference Paper
Full-text available
This paper considers the quantitative verification of discrete-time stochastic hybrid systems (DTSHS) against linear time objectives. The central question is to determine the likelihood of all the trajectories in a DTSHS that are accepted by an automaton on finite or infinite words. This verification covers regular and ω-regular properties, and thus comprises the linear temporal logic LTL. This work shows that these quantitative verification problems can be reduced to computing reachability probabilities over the product of an automaton and the DTSHS under study. The computation of reachability probabilities can be performed in a backward-recursive manner, and quantitatively approximated by procedures over discrete-time Markov chains. A case study shows the feasibility of the approach.
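The backward-recursive computation of reachability probabilities, approximated over a discrete-time Markov chain, can be illustrated with a fixed-point iteration; the three-state chain is a toy example, not from the paper:

```python
def reach_prob(P, targets, iters=200):
    """Probability of eventually reaching `targets` in a discrete-time
    Markov chain with transition matrix P, by iterating the fixed point
    x[s] = 1 for target states, else x[s] = sum_t P[s][t] * x[t]."""
    n = len(P)
    x = [1.0 if s in targets else 0.0 for s in range(n)]
    for _ in range(iters):
        x = [1.0 if s in targets else sum(P[s][t] * x[t] for t in range(n))
             for s in range(n)]
    return x

# From state 0: reach the goal (state 2) w.p. 0.4, the trap (state 1)
# w.p. 0.1, or stay w.p. 0.5; states 1 and 2 are absorbing.
P = [[0.5, 0.1, 0.4],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
probs = reach_prob(P, {2})
# analytically, probs[0] = 0.4 / (0.4 + 0.1) = 0.8
```

Product-automaton constructions reduce richer linear-time objectives to exactly this kind of reachability computation on the product state space.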
Conference Paper
Full-text available
Probabilistic model checking is an automatic formal verification technique for analysing quantitative properties of systems which exhibit stochastic behaviour. PRISM is a probabilistic model checking tool which has already been successfully deployed in a wide range of application domains, from real-time communication protocols to biological signalling pathways. The tool has recently undergone a significant amount of development. Major additions include facilities to manually explore models, Monte Carlo discrete-event simulation techniques for approximate model analysis (including support for distributed simulation) and the ability to compute cost- and reward-based measures, e.g. "the expected energy consumption of the system before the first failure occurs". This paper presents an overview of all the main features of PRISM. More information can be found on the website: www.cs.bham.ac.uk/~dxp/prism
Conference Paper
Full-text available
S-TaLiRo is a Matlab (TM) toolbox that searches for trajectories of minimal robustness in Simulink/Stateflow diagrams. It can analyze arbitrary Simulink models or user defined functions that model the system. At the heart of the tool, we use randomized testing based on stochastic optimization techniques including Monte-Carlo methods and Ant-Colony Optimization. Among the advantages of the toolbox is the seamless integration inside the Matlab environment, which is widely used in the industry for model-based development of control software. We present the architecture of S-TaLiRo and its working on an application example.
Conference Paper
Full-text available
We present MC², what we believe to be the first randomized, Monte Carlo algorithm for temporal-logic model checking. Given a specification S of a finite-state system, an LTL formula φ, and parameters ε and δ, MC² takes M = ⌈ln(δ)/ln(1−ε)⌉ random samples (random walks ending in a cycle, i.e., lassos) from the Büchi automaton B = B_S × B_¬φ to decide if L(B) = ∅. Let p_Z be the probability of sampling an accepting lasso in B. Should a sample reveal an accepting lasso l, MC² returns false with l as a witness. Otherwise, it returns true and reports that the probability of finding an accepting lasso through further sampling, under the assumption that p_Z ≥ ε, is less than δ. It does so in time O(MD) and space O(D), where D is B's recurrence diameter, using an optimal number of samples M. Our experimental results demonstrate that MC² is fast, memory-efficient, and scales extremely well.
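The sample-size bound is easy to evaluate directly: M is the smallest integer with (1−ε)^M ≤ δ, i.e. the probability of missing every accepting lasso (each sampled with probability at least ε) stays below δ:

```python
import math

def mc2_samples(eps, delta):
    """Number of lasso samples M = ceil(ln(delta) / ln(1 - eps)) so
    that, if accepting lassos occur with probability at least eps,
    the chance of seeing none in M samples is below delta."""
    return math.ceil(math.log(delta) / math.log(1.0 - eps))

M = mc2_samples(0.01, 0.05)   # eps = 1%, delta = 5%
```

For ε = 0.01 and δ = 0.05 this gives M = 299: the first sample count at which (1−ε)^M drops to δ or below.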
Article
Temporal logic verification has been proven to be a successful tool for the analysis of software and hardware systems. For such systems, both the models and the logic are Boolean valued. In the past, similar successful results have been derived for timed and linear hybrid systems. Even though the states of these systems are real valued, temporal logics are still interpreted over Boolean signals that abstract away the actual values of the real-valued state variables. In this thesis, we advocate that in certain cases it is beneficial to define multi-valued semantics for temporal logics. That is, we consider a robust interpretation of Metric Temporal Logic (MTL) formulas over signals that take values in metric spaces. For such signals, which are generated by systems whose states are equipped with nontrivial metrics, for example continuous or hybrid, robustness is not only natural, but also a critical measure of system performance. The proposed multi-valued semantics for MTL formulas captures not only the usual Boolean satisfiability of the formula, but also topological information regarding the distance from unsatisfiability. This, in turn, enables the definition of robustness tubes that contain signals with the same temporal properties. The notion of robustness for MTL specifications can be applied to at least three important problems. The first problem is the verification of continuous-time signals with respect to MTL specifications using only discrete-time analysis. The motivating idea behind our approach is that if the continuous-time signal fulfills certain conditions and the discrete-time signal robustly satisfies the MTL specification, then the corresponding continuous-time signal should also satisfy the same MTL specification. Second, the proposed robustness framework can be applied to the problem of bounded-time temporal logic verification of dynamical systems.
Our methodology has the distinctive feature of enabling the verification of temporal properties of a dynamical system by checking only a finite number of its (simulated) trajectories. The interesting and promising feature of this approach is that the more robust the system is with respect to the temporal logic specification, the fewer simulations are required in order to verify the system. Finally, the proposed definition of robustness for temporal logic specifications can be applied to the problem of automatic synthesis of hybrid systems. In particular, we address the problem of temporal logic motion planning for mobile robots that are modeled by second-order dynamics. Temporal logic specifications can capture the usual control specifications, such as reachability and invariance, as well as more complex specifications like sequencing and obstacle avoidance. The resulting continuous-time trajectory is provably guaranteed to satisfy the user specification.
Conference Paper
In Model Based Development (MBD) of embedded systems, it is often desirable to not only verify/falsify certain formal system specifications, but also to automatically explore the properties that the system satisfies. Namely, given a parametric specification, we would like to automatically infer the ranges of parameters for which the property holds/does not hold on the system. In this paper, we consider parametric specifications in Metric Temporal Logic (MTL). Using robust semantics for MTL, the parameter estimation problem can be converted into an optimization problem which can be solved by utilizing stochastic optimization methods. The framework is demonstrated on some examples from the literature.
Chapter
1.2 Modeling with timed and hybrid automata
Book
Contents:
STOCHASTIC HYBRID SYSTEMS: RESEARCH ISSUES AND AREAS (Christos G. Cassandras and John Lygeros): Introduction; Modeling of Nondeterministic Hybrid Systems; Modeling of Stochastic Hybrid Systems; Overview of This Volume.
STOCHASTIC DIFFERENTIAL EQUATIONS ON HYBRID STATE SPACES (Jaroslav Krystul, Henk A.P. Blom, and Arunabha Bagchi): Introduction; Semimartingales and Characteristics; Semimartingale Strong Solution of SDE; Stochastic Hybrid Processes as Solutions of SDE; Instantaneous Hybrid Jumps at a Boundary; Related SDE Models on Hybrid State Spaces; Markov and Strong Markov Properties; Concluding Remarks.
COMPOSITIONAL MODELING OF STOCHASTIC HYBRID SYSTEMS (Stefan Strubbe and Arjan van der Schaft): Introduction; Semantical Models; Communicating PDPs; Conclusions.
STOCHASTIC MODEL CHECKING (Joost-Pieter Katoen): Introduction; The Discrete-Time Setting; The Continuous-Time Setting; Bisimulation and Simulation Relations; Epilogue.
STOCHASTIC REACHABILITY: THEORY AND NUMERICAL APPROXIMATION (Maria Prandini and Jianghai Hu): Introduction; Stochastic Hybrid System Model; Reachability Problem Formulation; Numerical Approximation Scheme; Reachability Computations; Possible Extensions; Some Examples; Conclusion.
STOCHASTIC FLOW SYSTEMS: MODELING AND SENSITIVITY ANALYSIS (Christos G. Cassandras): Introduction; Modeling Stochastic Flow Systems; Sample Paths of Stochastic Flow Systems; Optimization Problems in Stochastic Flow Systems; Infinitesimal Perturbation Analysis (IPA); Conclusions.
PERTURBATION ANALYSIS FOR STOCHASTIC FLOW SYSTEMS WITH FEEDBACK (Yorai Wardi, George Riley, and Richelle Adams): Introduction; SFM with Flow Control; Retransmission-Based Model; Simulation Experiments; Conclusions.
STOCHASTIC HYBRID MODELING OF ON-OFF TCP FLOWS (Joao Hespanha): Related Work; A Stochastic Model for TCP; Analysis of the TCP SHS Models; Reduced-Order Models; Conclusions.
STOCHASTIC HYBRID MODELING OF BIOCHEMICAL PROCESSES (Panagiotis Kouretas, Konstantinos Koutroumpas, John Lygeros, and Zoi Lygerou): Introduction; Overview of PDMP; Subtilin Production by B. subtilis; DNA Replication in the Cell Cycle; Concluding Remarks.
FREE FLIGHT COLLISION RISK ESTIMATION BY SEQUENTIAL MC SIMULATION (Henk A.P. Blom, Jaroslav Krystul, G.J. (Bert) Bakker, Margriet B. Klompstra, and Bart Klein Obbink): Introduction; Sequential MC Estimation of Collision Risk; Development of a Petri Net Model of Free Flight; Simulated Scenarios and Collision Risk Estimates; Concluding Remarks.
Article
In this paper, we consider the robust interpretation of Metric Temporal Logic (MTL) formulas over signals that take values in metric spaces. For such signals, which are generated by systems whose states are equipped with non-trivial metrics, for example continuous or hybrid, robustness is not only natural, but also a critical measure of system performance. Thus, we propose multi-valued semantics for MTL formulas, which capture not only the usual Boolean satisfiability of the formula, but also topological information regarding the distance, ε, from unsatisfiability. We prove that any other signal that remains ε-close to the initial one also satisfies the same MTL specification under the usual Boolean semantics. Finally, our framework is applied to the problem of testing formulas of two fragments of MTL, namely Metric Interval Temporal Logic (MITL) and closed Metric Temporal Logic (clMTL), over continuous-time signals using only discrete-time analysis. The motivating idea behind our approach is that if the continuous-time signal fulfills certain conditions and the discrete-time signal robustly satisfies the temporal logic specification, then the corresponding continuous-time signal should also satisfy the same temporal logic specification.
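The key guarantee — any signal that stays ε-close (in sup norm) to one satisfying the formula with robustness greater than ε also satisfies it — can be checked concretely on a small example; the trace, bound, and predicate are illustrative:

```python
def rob_always_lt(trace, c):
    """Robustness of 'always x < c' on a sampled signal:
    the smallest margin to the bound over all samples."""
    return min(c - x for x in trace)

trace = [0.1, 0.6, 0.3]
rho = rob_always_lt(trace, 1.0)        # margin 1.0 - 0.6 = 0.4

# Any perturbation with sup-norm strictly below rho preserves
# satisfaction; shifting every sample toward the bound is the
# worst case for this particular property.
eps = 0.3
perturbed = [x + eps for x in trace]
still_sat = rob_always_lt(perturbed, 1.0) > 0
```

The perturbed robustness is at least rho − eps, so positive robustness defines a tube of signals that all share the property's Boolean verdict.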
Conference Paper
Hybrid automata offer a framework for the description of systems with both discrete and continuous components, such as digital technology embedded in an analogue environment. Traditional uses of hybrid automata express choice of transitions purely in terms of nondeterminism, abstracting potentially significant information concerning the relative likelihood of certain behaviours. To model such probabilistic information, we present a variant of hybrid automata augmented with discrete probability distributions. We concentrate on restricted subclasses of the model in order to obtain decidable model checking algorithms for properties expressed in probabilistic temporal logics.
Conference Paper
We describe Breach, a Matlab/C++ toolbox providing a coherent set of simulation-based techniques aimed at the analysis of deterministic models of hybrid dynamical systems. The primary feature of Breach is to facilitate the computation and the property investigation of large sets of trajectories. It relies on an efficient numerical solver of ordinary differential equations that can also provide information about sensitivity with respect to parameter variations. The latter is used to perform approximate reachability analysis and parameter synthesis. A major novel feature is the robust monitoring of metric interval temporal logic (MITL) formulas. The application domain of Breach ranges from embedded systems design to the analysis of complex non-linear models from systems biology.
Article
This paper is motivated by the need for a formal specification method for real-time systems. In these systems, quantitative temporal properties play a dominant role. We first characterize real-time systems by giving a classification of such quantitative temporal properties. Next, we extend the usual models for temporal logic by including a distance function to measure time and analyze what restrictions should be imposed on such a function. Then we introduce appropriate temporal operators to reason about such models by turning qualitative temporal operators into (quantitative) metric temporal operators and show how the usual quantitative temporal properties of real-time systems can be expressed in this metric temporal logic. After illustrating the application of metric temporal logic to real-time systems with several examples, we end this paper with some conclusions.
Article
Finding mathematical models satisfying a specification built from the formalization of biological experiments is a common task of the modeler, one that techniques like model checking help to solve, in the qualitative but also in the quantitative case. In this article we define a continuous degree of satisfaction of temporal logic formulae with constraints. We show how such a satisfaction measure can be used as a fitness function with state-of-the-art evolutionary optimization methods in order to find biochemical kinetic parameter values satisfying a set of biological properties formalized in temporal logic. We also show how it can be used to define a measure of robustness of a biological model with respect to some temporal specification. These methods are evaluated on models of the cell cycle and of the MAPK signaling cascade.
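Using a continuous satisfaction degree as an evolutionary fitness can be sketched with a (1+1) evolution strategy tuning one kinetic-style parameter; the decaying process, the property, and the cost term are all toy assumptions:

```python
import math
import random

def fitness(k):
    """Degree of satisfaction of 'x(t) < 1.5 for t >= 5' for the
    decaying process x(t) = 2 exp(-k t / 10), minus a small cost on
    the rate parameter k (so the optimum is interior)."""
    rob = min(1.5 - 2.0 * math.exp(-k * t / 10.0) for t in range(5, 11))
    return rob - 0.1 * k

def es_1plus1(fitness, iters=3000, sigma=0.5, seed=0):
    """(1+1) evolution strategy: mutate the parent with Gaussian
    noise and keep the child whenever it is no worse."""
    rng = random.Random(seed)
    k = 0.1
    best = fitness(k)
    for _ in range(iters):
        cand = k + rng.gauss(0.0, sigma)
        f = fitness(cand)
        if f >= best:
            k, best = cand, f
    return k, best

k_star, f_star = es_1plus1(fitness)
# analytic optimum of this toy fitness: k = 2 ln 10 ≈ 4.61
```

Because the satisfaction degree varies continuously with the parameter, the search gets gradient-like guidance even before the property is satisfied at all, which a Boolean verdict cannot provide.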
Conference Paper
We propose a model independent procedure for verifying properties of discrete event systems. The dynamics of such systems can be very complex, making them hard to analyze, so we resort to methods based on Monte Carlo simulation and statistical hypothesis testing. The verification is probabilistic in two senses. First, the properties, expressed as CSL formulas, can be probabilistic. Second, the result of the verification is probabilistic, and the probability of error is bounded by two parameters passed to the verification procedure. The verification of properties can be carried out in an anytime manner by starting off with loose error bounds, and gradually tightening these bounds.
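A standard way to realize such a probabilistic verdict with bounded error is Wald's sequential probability ratio test; the hypotheses, error bounds, and trace sampler below are illustrative assumptions, not the paper's exact procedure:

```python
import math
import random

def sprt(sample_sat, p0, p1, alpha, beta, seed=0):
    """Wald's SPRT for H0: p >= p0 versus H1: p <= p1 (p1 < p0),
    with error probabilities bounded by alpha and beta. Each
    observation updates a log-likelihood ratio until it crosses a
    threshold, so the test stops as soon as the evidence suffices
    (and error bounds can be tightened in an anytime fashion by
    rerunning with smaller alpha and beta)."""
    rng = random.Random(seed)
    lo = math.log(beta / (1.0 - alpha))      # crossing below -> accept H0
    hi = math.log((1.0 - beta) / alpha)      # crossing above -> accept H1
    llr, n = 0.0, 0
    while lo < llr < hi:
        n += 1
        if sample_sat(rng):
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1.0 - p1) / (1.0 - p0))
    return ("H0" if llr <= lo else "H1"), n

# toy sampler: the property holds on 98% of simulated traces
verdict, n_used = sprt(lambda rng: rng.random() < 0.98,
                       p0=0.9, p1=0.8, alpha=0.01, beta=0.01)
```

The gap between p0 and p1 is the indifference region: the further the true probability lies from it, the faster the log-likelihood ratio drifts to a boundary and the fewer simulations are needed.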
Article
In this paper we explore different, simulation-based strategies. Section 2 reviews the basic concept of Monte Carlo simulation for the evaluation of U(d), including a proposal of "borrowing strength" across simulations under different designs d by smoothing through simulated pairs of design and observed utilities. Section 3 approaches the problem with very different strategies, using a model augmentation which defines an artificial probability model on the triple of design, data and parameters. Simulation in the augmented model is shown to be equivalent to solving the optimal design problem (1). Critical shortcomings of the proposed approach are problems arising with flat and high-dimensional expected utility surfaces. Section 4 proposes an idea reminiscent of simulated annealing which replaces the expected utility surface by a more peaked surface without changing the solution of the optimal design problem. Except in special problems, the stochastic optimization problem (1) does not allow a closed-form solution. Often the utility function u(Δ) is chosen to allow analytic evaluation even if a more problem-specific utility/loss function were available. A typical example is the use of preposterior variance on some parameters of interest instead of total treatment success in medical decision problems. More realistic utility functions can be used in simulation-based solutions to the optimal design problem. Most simulation-based methods for optimal design are based on the observation that the integral in U(d) is easily evaluated by Monte Carlo simulation.