
Interactive WCET Prediction with Warning for Timeout Risk

World Scientific
International Journal of Pattern Recognition and Artificial Intelligence

Abstract

Worst case execution time (WCET) analysis is essential for exposing timeliness defects when developing hard real-time systems. However, because developers generally perform WCET analysis only in a final verification phase, it is too late to fix timeliness defects cheaply. To help developers quickly identify real timeliness defects in an early programming phase, a novel interactive WCET prediction with warning for timeout risk is proposed. The novelty is that the approach not only quickly estimates WCET based on a control flow tree (CFT), but also assesses the estimated WCET with a trusted level via a lightweight false path analysis. According to the trusted levels, corresponding warnings are triggered once the estimated WCET exceeds a preset safe threshold, so developers can identify real timeliness defects earlier and more efficiently. To this end, we first analyze the reasons for the overestimation of CFT-based WCET calculation; then we propose a trusted level model of timeout risks; we develop a risk data counting algorithm for recognizing the structural patterns of timeout risks; and we give some tactics for applying the approach more effectively. Experimental results show that our approach runs almost as fast as fast and interactive WCET analysis, but saves more time in identifying real timeliness defects.
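As a rough illustration of the idea in the abstract, the sketch below (our own illustrative code, not the paper's implementation; the node kinds, cycle costs, loop bounds, and the trusted-level scale are all invented for the example) estimates WCET over a control flow tree and raises a warning, qualified by a trusted level, once the estimate exceeds a preset safe threshold:

```java
import java.util.ArrayList;
import java.util.List;

public class WcetSketch {
    enum Kind { BLOCK, SEQ, BRANCH, LOOP }

    static class Node {
        Kind kind;
        long cost;       // cycles of a basic block (BLOCK nodes)
        int loopBound;   // iteration bound (LOOP nodes)
        List<Node> children = new ArrayList<>();
        Node(Kind k, long c, int b) { kind = k; cost = c; loopBound = b; }
    }

    // Classical tree-based WCET: sum over sequences, max over branches,
    // loop body cost multiplied by the loop bound.
    static long wcet(Node n) {
        switch (n.kind) {
            case BLOCK: return n.cost;
            case SEQ: {
                long sum = 0;
                for (Node c : n.children) sum += wcet(c);
                return sum;
            }
            case BRANCH: {
                long max = 0;
                for (Node c : n.children) max = Math.max(max, wcet(c));
                return max;
            }
            case LOOP: {
                long body = 0;
                for (Node c : n.children) body += wcet(c);
                return (long) n.loopBound * body;
            }
        }
        return 0;
    }

    // Warn only when the estimate crosses the safe threshold; the trusted
    // level (here an assumed scale where 1 = most trusted) qualifies the
    // warning, standing in for the paper's false-path analysis.
    static String check(Node root, long threshold, int trustedLevel) {
        long est = wcet(root);
        if (est <= threshold) return "OK";
        return trustedLevel > 1 ? "WARNING: possible timeout (low trust)"
                                : "WARNING: likely timeout (high trust)";
    }
}
```

A tree for a block of 10 cycles, a 5-iteration loop over a 3-cycle body, and a branch whose longer arm costs 7 cycles yields an estimate of 32 cycles, so a threshold of 30 triggers a warning while 40 does not.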
Fanqi Meng, Xiaohong Su and Zhaoyang Qu

School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150001, P. R. China
School of Information Engineering, Northeast Dianli University, Jilin 132012, P. R. China

mengfanqi@nedu.edu.cn, sxh@hit.edu.cn, quzhaoyang@nedu.edu.cn

Received 8 August 2016
Accepted 19 September 2016
Published 30 November 2016
Keywords: Pattern recognition; software safety; interactive WCET analysis; risk warning;
embedded Java.
1. Introduction

Worst case execution time (WCET) is an important issue in the research field of real-time systems because it is a key parameter to both the schedulability analysis of tasks [14, 36, 53-55] and the safety validation of software [11, 20, 28]. With the development of ...

Corresponding author: Xiaohong Su (sxh@hit.edu.cn).

International Journal of Pattern Recognition and Artificial Intelligence
Vol. 31, No. 5 (2017) 1750012 (24 pages)
© World Scientific Publishing Company
DOI: 10.1142/S0218001417500124
... In real-time systems, especially emerging safety-critical cyber-physical systems (CPS), such as the active braking system of an automobile, the automatic cruise system of an unmanned aerial vehicle, or the relay protection system of a smart grid [2,3], a program's execution time is vital. Even in the worst case, the execution time must not exceed the deadline [4]. Otherwise it may cause disastrous consequences. ...
... When coding, programmers should pay attention to the WCET of each component at any time. Once the WCET is found to exceed the expected time, the program is deemed to have a timeliness defect and should be repaired immediately [4]. On this basis, the program can be further optimized at the later performance optimization stage to achieve better average performance and WCET. ...
Article
Full-text available
For safety-critical real-time software, if the worst-case execution time (WCET) violates a time constraint, the software is considered to have a timeliness defect. To fix the defect early at lower cost, a WCET optimization strategy is proposed based on source code refactoring. The strategy guides programmers to search for refactoring opportunities in the correct positions and to perform refactorings in a reasonable sequence. To this end, the worst-case execution path (WCEP) of a target program is first extracted from its control flow graph. Then the WCEP is mapped onto source code by the back-annotation technique. An abstract syntax tree-based invariant path identification algorithm is developed for recognizing the invariant paths in the source-level WCEP. According to the invariant paths and loop statements, the source code is divided into four optimization regions with different priorities. Thus the search scopes are reduced, and invalid refactorings are avoided. On this basis, the refactoring which has the lowest cost in the same region is performed first. To support the strategy, a cost model of source code refactoring is designed; it mainly considers the adverse effects of refactorings on the maintainability of source code. The experimental results showed that the optimization strategy reduced WCET effectively while largely preserving maintainability. Therefore it is more suitable for WCET optimization in an early programming phase, helping to fix defects early and thus guarantee the timeliness safety of the software.
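The ordering rule described above (prioritized regions, cheapest refactoring first within a region) can be sketched as a simple sort; the class names and the numeric priority/cost scales below are our own assumptions, not the paper's cost model:

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class RefactoringOrder {
    static class Refactoring {
        String name;
        int regionPriority; // lower value = higher-priority optimization region
        double cost;        // maintainability cost; lower = cheaper
        Refactoring(String n, int p, double c) { name = n; regionPriority = p; cost = c; }
    }

    // Order refactorings: higher-priority region first, then cheapest
    // (least maintainability damage) first within the same region.
    static List<String> order(List<Refactoring> rs) {
        return rs.stream()
                 .sorted(Comparator.comparingInt((Refactoring r) -> r.regionPriority)
                                   .thenComparingDouble(r -> r.cost))
                 .map(r -> r.name)
                 .collect(Collectors.toList());
    }
}
```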
... IPET-based WCET calculation[30]. ...
Article
Full-text available
In order to reduce overestimations of worst-case execution time (WCET), in this article, we first report a kind of specific WCET overestimation caused by non-orthogonal nested loops. Then, we propose a novel correction approach which has three basic steps. The first step is to locate the worst-case execution path (WCEP) in the control flow graph and then map it onto source code. The second step is to identify non-orthogonal nested loops in the WCEP by means of an abstract syntax tree. The last step is to recursively calculate the WCET errors caused by the loose loop bound constraints, and then subtract the total errors from the overestimations. The novelty lies in the fact that the WCET correction is only conducted on the non-branching part of the WCEP, thus avoiding potential safety risks caused by possible WCEP switches. Experimental results show that our approach reduces the specific WCET overestimation by an average of more than 82%, and 100% of the corrected WCETs are no less than the actual WCET. Thus, our approach is not only effective but also safe. It will help developers to design energy-efficient and safe real-time systems.
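To see how a loose loop bound inflates WCET for a non-orthogonal nest, consider the triangular pattern `for (i = 0; i < n; i++) for (j = 0; j < i; j++)`. A minimal sketch (illustrative arithmetic only; the abstract's recursive correction is not reproduced here) compares the loose bound, which charges the inner loop its maximum trip count on every outer iteration, against the exact count:

```java
public class NestedLoopCorrection {
    // Loose IPET-style bound: inner loop assumed to run its maximum
    // (n - 1) iterations on each of the n outer iterations.
    static long looseInnerIterations(long n) { return n * (n - 1); }

    // Exact count for the triangular nest: sum of 0..(n-1) = n(n-1)/2.
    static long exactInnerIterations(long n) { return n * (n - 1) / 2; }

    // WCET error introduced by the loose bound for a fixed per-iteration
    // cost; subtracting it from the loose estimate corrects the WCET.
    static long correction(long n, long costPerIteration) {
        return (looseInnerIterations(n) - exactInnerIterations(n)) * costPerIteration;
    }
}
```

For n = 10 the loose bound counts 90 inner iterations against 45 actual, so half of the inner-loop contribution is pure overestimation.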
Chapter
Because the existing LDA model has difficulty determining the number of topics and the key time points, its topic results are hard to interpret accurately. In this paper, the DTS-ILDA model is proposed, which fuses an improved clustering algorithm into the DTM model and uses label information for supervised learning on each subset. In this model, the size of the sliding window varies according to the topic distribution characteristics, so text segmentation can be performed more reasonably. The number of topics is also variable and easy to understand. The experiments show that this method can effectively find the time points of important changes in topic content and suppress insignificant topics. It can reduce the interference of wrong topics while uncovering exact deep relationships.
Chapter
To address the problems of power information visualization, such as limited presentation forms, low efficiency, and poor intuitiveness, a general three-dimensional power server operations information visualization model is presented in this paper. First, the construction method of server-related device models is given. Then, the method for building the power server operations information visual scene is proposed, including a rapid scene organization strategy and an exploration of the storage and reuse mechanism of three-dimensional visualization scenes. Moreover, collision detection is improved using swarm intelligence and bionic computing. Finally, the feasibility and practicality of the method are verified by a developed intelligent power virtual simulation platform based on JME (JMonkeyEngine).
Article
Full-text available
For the safety of real-time systems, it is very important that the execution time of programs meets all time constraints, even in the worst case. To expose timeliness defects which may cause an execution timeout as early as possible, we have studied a novel nonlinear approach for estimating worst case execution time (WCET) during the programming phase, called NL-WCET. In this paper, we propose a program features model, based on which NL-WCET employs a least squares support vector machine (LSSVM) to learn the program features and then estimate WCET. To improve the accuracy of NL-WCET, we develop an algorithm for training sample optimization. The experimental results show that both the model and the algorithm have distinct effects on the accuracy of NL-WCET. When static similarity is ≥ 80%, cosine similarity is ≥ 99.5%, and the max quotient between corresponding items is ≤ 50, the average error of NL-WCET is merely 0.82%, much lower than that of conventional WCET measurement. Meanwhile, it also has higher efficiency than conventional WCET analysis. Thus NL-WCET is suitable for use during the programming phase and can help programmers discover timeliness defects as early as possible.
Article
Full-text available
Worst-case execution time (WCET) analysis of a program is important to verify the temporal correctness of real-time systems. Parametric WCET analysis represents the WCET of the program as a formula, where the unknown values affecting the WCET are parameterized. Many issues usually affect the WCET of a program, including the loop bound. In parametric timing analysis, instead of determining a constant upper bound for a loop, a symbolic formula represents the loop bound. In this paper, a new method is presented for parametric loop bound analysis based on path analysis. Instead of considering the basic blocks on their own and independently of the rest, the execution paths within the loop body have to be analyzed. There are certain situations in which the execution of certain statements of an execution path affects the number of executions of all the basic blocks along the execution path. Therefore, a more accurate estimation of the number of loop iterations is provided. The results of analysis on the Mälardalen benchmark suite reveal the accuracy of the proposed method.
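The core idea of parametric analysis, a WCET that stays a formula in an unknown parameter rather than a constant, can be shown in miniature (a toy linear form of our own choosing, not the paper's analysis):

```java
import java.util.function.LongUnaryOperator;

public class ParametricWcet {
    // A parametric WCET for a single loop whose bound is the unknown
    // parameter n: entry overhead plus n times the per-iteration cost.
    // The result is a formula (a function of n), not a constant bound,
    // so it can be instantiated later when n becomes known.
    static LongUnaryOperator parametricWcet(long entryCost, long iterationCost) {
        return n -> entryCost + n * iterationCost;
    }
}
```

Instantiating the formula with, say, entry cost 100 and iteration cost 12 at n = 8 gives 196 cycles.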
Article
In embedded systems, SPM (scratchpad memory) is an attractive alternative to cache memory due to its lower energy consumption and higher predictability of program execution. This paper studies the problem of placing variables of a program into an SPM such that its WCET (worst-case execution time) is minimized. We propose an efficient dynamic approach that comprises two novel heuristics. The first heuristic iteratively selects a most beneficial variable as an SPM resident candidate based on its impact on the k longest paths of the program. The second heuristic incrementally allocates each SPM resident candidate to the SPM based on graph coloring and acyclic graph orientation. We have evaluated our approach by comparing with an ILP-based approach and a longest-path-based greedy approach using the eight benchmarks selected from Powerstone and Mälardalen WCET Benchmark suites under three different SPM configurations. Our approach achieves up to 21% and 43% improvements in WCET reduction over the ILP-based approach and the greedy approach, respectively.
Article
When designing hard real-time embedded systems, it is required to estimate the worst-case execution time (WCET) of each task for schedulability analysis. Precise cache persistence analysis can significantly tighten the WCET estimation, especially when the program has many loops. Methods for persistence analysis should safely and precisely classify memory references as persistent. Existing safe approaches suffer from multiple sources of pessimism and may not provide precise results. In this paper, we first identify some sources of pessimism that two recent approaches based on younger set and may analysis may encounter. Then, we propose two methods to eliminate these sources of pessimism. The first method improves the update function of the may analysis-based approach; and the second method integrates the younger set-based and may analysis-based approaches together to further reduce pessimism. We also prove the two proposed methods are still safe. We evaluate the approaches on a set of benchmarks and observe the number of memory references classified as persistent is increased by the proposed methods. Moreover, we empirically compare the storage space and analysis time used by different methods.
Article
Real-time critical systems can be considered as correct if they compute both right and fast enough. Functionality aspects (computing right) can be addressed using high level design methods, such as the synchronous approach that provides languages, compilers and verification tools. Real-time aspects (computing fast enough) can be addressed with static timing analysis, that aims at discovering safe bounds on the worst-case execution time (WCET) of the binary code. In this paper, we aim at improving the estimated WCET in the case where the binary code comes from a high-level synchronous design. The key idea is that some high-level functional properties may imply that some execution paths of the binary code are actually infeasible, and thus, can be removed from the worst-case candidates. In order to automatize the method, we show (1) how to trace semantic information between the high-level design and the executable code, (2) how to use a model-checker to prove infeasibility of some execution paths, and (3) how to integrate such infeasibility information into an existing timing analysis framework. Based on a realistic example, we show that there is a large possible improvement for a reasonable computation time overhead.
Article
The C language does not provide any abstractions for exception handling or other forms of error handling, leaving programmers to devise their own conventions for detecting and handling errors. The Linux coding style guidelines suggest placing error handling code at the end of each function, where it can be reached by gotos whenever an error is detected. This coding style has the advantage of putting all of the error-handling code in one place, which eases understanding and maintenance, and reduces code duplication. Nevertheless, this coding style is not always applied. In this paper, we propose an automatic program transformation that transforms error-handling code into this style. We have applied our transformation to the Linux 2.6.34 kernel source code, on which it reorganizes the error handling code of over 1800 functions, in about 25 minutes.
Article
This paper presents an integration method for AUTOSAR-compliant ECUs which can evaluate resource constraints in an early stage of development. There are three types of resources for an ECU (timing, memory, and interface) which should be carefully managed for successful ECU integration. The proposed method consists of three steps: measurement, prediction, and evaluation. In the first step, a method to measure resource factors for AUTOSAR-compliant software architecture is introduced. Based on the method, the worst-case execution cycle of a runnable, the memory section usage of a software component, and the interfaces of legacy ECUs can be obtained. In the second step, the obtained factors are quantitatively predicted according to the architectural designs of the integration ECU. In the case of the timing resource, the worst-case execution time of the integration ECU can be precisely predicted by a proposed empirical model. In the last step, the resource constraints, such as CPU, memory, and network utilization, can be evaluated with predicted resource factors before implementation. The proposed method was applied to the integration of an in-house engine management system composed of two ECUs. The method successfully provided quantitative measures to evaluate the architectural designs of three different integration scenarios.
Article
In this paper, we discuss software design issues related to the development of parallel computational intelligence algorithms on multi-core CPUs, using the new Java 8 functional programming features. In particular, we focus on probabilistic graphical models (PGMs) and present the parallelization of a collection of algorithms that deal with inference and learning of PGMs from data. Namely, maximum likelihood estimation, importance sampling, and greedy search for solving combinatorial optimization problems. Through these concrete examples, we tackle the problem of defining efficient data structures for PGMs and parallel processing of same-size batches of data sets using Java 8 features. We also provide straightforward techniques to code parallel algorithms that seamlessly exploit multicore processors. The experimental analysis, carried out using our open source AMIDST (Analysis of MassIve Data STreams) Java toolbox, shows the merits of the proposed solutions.
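In the spirit of the parallel inference algorithms described above, a minimal example (our own, not taken from the AMIDST toolbox) uses Java 8 parallel streams for the simplest case of maximum likelihood estimation, a Bernoulli parameter from a batch of 0/1 observations:

```java
import java.util.stream.IntStream;

public class ParallelMle {
    // Maximum likelihood estimate of a Bernoulli parameter: the fraction
    // of 1s in the batch. The count is computed with a parallel stream,
    // so the work is split across the available cores automatically.
    static double mle(int[] observations) {
        long ones = IntStream.of(observations).parallel()
                             .filter(x -> x == 1)
                             .count();
        return (double) ones / observations.length;
    }
}
```

The same `.parallel()` switch applies unchanged to heavier per-element work, which is where multicore speedups actually appear; counting alone is usually memory-bound.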
Article
Real-time systems require a safe and precise estimate of the worst-case execution time (WCET) of programs. In multicore architectures, the precision of a program's WCET estimate highly depends on the precision of its predicted shared cache behavior. Prediction of shared cache behavior is difficult due to the uncertain timing of interfering shared cache accesses made by programs running on other cores. Given the assignment of programs to cores, the worst-case interference placement (WCIP) technique tries to find the worst-case timing of interfering accesses, which would cause the maximum number of cache misses on the worst-case path of the program, to determine its WCET. Although WCIP generates highly precise WCET estimates, the current ILP-based approach is also known to have very high analysis time. In this work, we investigate the WCIP problem in detail and determine its source of hardness. We show that performing WCIP is an NP-hard problem by reduction from the 0-1 knapsack problem. We use this observation to make simplifying assumptions, which make the WCIP problem tractable, and we propose an approximate greedy technique for WCIP, whose time complexity is linear in the size of the program. We perform extensive experiments to show that the assumptions do not affect the precision of WCIP but result in a significant reduction of analysis time.
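Since the abstract relates WCIP to 0-1 knapsack and proposes a greedy approximation, the flavor of such a greedy can be shown on plain knapsack (a generic value-density heuristic, not the paper's actual placement algorithm; the mapping of items to interfering accesses is our assumption):

```java
import java.util.Arrays;

public class GreedyWcip {
    // Greedy 0-1 knapsack by value density, a stand-in for approximate
    // interference placement: items ~ interfering accesses, value ~ extra
    // misses they cause, capacity ~ the interference budget on the
    // worst-case path. Runs in O(n log n) instead of solving an ILP.
    static long greedy(long[] value, long[] weight, long capacity) {
        Integer[] idx = new Integer[value.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        // Sort items by value/weight ratio, densest first.
        Arrays.sort(idx, (a, b) -> Double.compare(
            (double) value[b] / weight[b], (double) value[a] / weight[a]));
        long total = 0;
        for (int i : idx) {
            if (weight[i] <= capacity) {  // take the item if it still fits
                capacity -= weight[i];
                total += value[i];
            }
        }
        return total;
    }
}
```

For values {60, 100, 120} with weights {10, 20, 30} and capacity 50, the greedy takes the two densest items for a total of 160; unlike an exact ILP it may miss the optimum on adversarial inputs, which is the precision-for-time trade the abstract argues is acceptable.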