Figure 5: Selective and exhaustive approaches. A number of important factors must be taken into account when choosing a particular approach.
Source publication
Early assessment of the compositional properties of component compositions is one of the hottest issues in component-based architecting. We describe a method for evaluating the static properties of an architecture, given the features of its constituents. The estimation framework is based on composition rules and the specification of the static prop...
Contexts in source publication
Context 1
... the exhaustive approach, all the diversity parameters of all components are taken into account (see Figure 5). The component hierarchy is traversed bottom-up, starting from the basic components and moving up to those at the defined level of the hierarchy. ...
Context 2
... selective approach deals only with the components and diversity parameters that are sufficient for achieving a desired level of precision. The approach starts at some fixed level of the composition hierarchy (e.g., components C1, C2, and C3 in Figure 5) and traverses the component hierarchy top-down. ...
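To make the contrast concrete, the following sketch (ours, not the paper's) assumes a static property, such as memory footprint, that composes by summation: the exhaustive traversal aggregates over the whole hierarchy bottom-up, while the selective traversal descends top-down only to a chosen cut-off depth and substitutes a coarse estimate below it. All names and numbers are illustrative.

```python
# Illustrative sketch: a component hierarchy whose static property
# (e.g., memory footprint in bytes) composes by summation.

class Component:
    def __init__(self, name, footprint=0, children=()):
        self.name = name
        self.footprint = footprint      # measured property of a basic component
        self.children = list(children)  # subcomponents; empty for basic components

def exhaustive(component):
    """Bottom-up: descend to the basic components and aggregate everything."""
    if not component.children:
        return component.footprint
    return sum(exhaustive(child) for child in component.children)

def selective(component, depth, estimate):
    """Top-down: descend only as far as the desired precision requires,
    substituting a coarse per-component estimate below the cut-off depth."""
    if depth == 0 or not component.children:
        return estimate(component)
    return sum(selective(child, depth - 1, estimate) for child in component.children)

c1 = Component("C1", children=[Component("C11", 40), Component("C12", 24)])
coarse = lambda c: c.footprint if not c.children else 70  # crude bound for compounds
print(exhaustive(c1))          # 64, full bottom-up aggregation
print(selective(c1, 0, coarse))  # 70, no descent: coarse estimate only
print(selective(c1, 1, coarse))  # 64, one level of descent reaches the basics
```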
Citations
... The ontology development sub-process is regarded as an important preparatory part, but also as a complementary sub-process, of SMF definition [15]. Its preparatory and complementary nature comes from the assumption that the initial definition of SMFs is based on various concept and entity ontologies. ...
... In the first step, a composition of ontological concepts is used to define alternative physically coherent and feasible structures for SMF genotypes. The genotype creates multiple structural relations among semantically compatible ontological concepts in order to form physically coherent structures [15]. The second step of the sub-process introduces structural parametrization over the structures captured within a genotype. ...
... The integration of (abstractions of) QoS properties into component models is supported by several component-based approaches and tools, such as KLAPER [GMRS07], Palladio [BKR07] and RoboCop [FEHC02]. As these component models do not define any refinement notion, they are clearly distinguishable from our work. ...
Several scientific bottlenecks have been identified in existing component-based approaches. Among them, we focus on the identification of a relevant abstraction for the component expression and verification of properties like substitutivity: when is it possible to formally accept or reject the substitution of a component in a composition? This paper suggests integer weighted automata to tackle this problem when considering a new factor — Quality of Service (QoS). Four notions of simulation-based substitutivity managing QoS aspects are proposed, and related complexity issues on integer weighted automata are investigated. Furthermore, the paper defines composition operators: sequential, strict-sequential and parallel compositions, bringing path costs into the analysis. New results on the compatibility of proposed substitutivity notions w.r.t. sequential and parallel composition operators are established.
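To give a flavor of the model involved, the sketch below uses our own encoding (the paper's definitions, including simulation-based substitutivity and the strict-sequential and parallel operators, are richer): an integer weighted automaton as a set of weighted transitions, with the cost of a path being the sum of its weights, and sequential composition making path costs add across the two runs.

```python
# Hypothetical encoding (ours, not the paper's) of an integer weighted
# automaton: transitions are (src, action, weight, dst) tuples.

class WeightedAutomaton:
    def __init__(self, initial, final, transitions):
        self.initial = initial
        self.final = set(final)
        self.transitions = transitions  # list of (src, action, weight, dst)

def path_cost(automaton, word):
    """Minimal cost of accepting `word` from the initial state, or None."""
    best = {automaton.initial: 0}
    for a in word:
        step = {}
        for (src, act, w, dst) in automaton.transitions:
            if act == a and src in best and best[src] + w < step.get(dst, float("inf")):
                step[dst] = best[src] + w
        best = step
    costs = [c for s, c in best.items() if s in automaton.final]
    return min(costs, default=None)

def sequential(a, b):
    """Sequential composition: a run of `a` followed by a run of `b`
    (disjoint state names assumed); path costs add across the two runs."""
    bridged = [(f, act, w, dst)  # re-root b's initial transitions at a's finals
               for f in a.final
               for (src, act, w, dst) in b.transitions if src == b.initial]
    final = b.final | (a.final if b.initial in b.final else set())
    return WeightedAutomaton(a.initial, final, a.transitions + b.transitions + bridged)

a = WeightedAutomaton("p", {"q"}, [("p", "send", 2, "q")])
b = WeightedAutomaton("r", {"s"}, [("r", "ack", 3, "s")])
print(path_cost(sequential(a, b), ["send", "ack"]))  # 5 = 2 + 3
```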
... Resource consumption is exhibited by IResource interfaces and is considered constant per operation [23]. An evaluation mechanism to measure static memory consumption has been developed for Koala [9]. ...
... The Mbox, Queue, Sem and Mutex modules are gathered under the IPC appellation (for Inter-Process Communication). For example, the Core module makes 28 function calls to the Task module, and a total of 36 calls across all modules. ...
As embedded systems must constantly integrate new functionalities, their development cycles must be based on high-level abstractions, making the software design more flexible. CBSE provides an approach to these new requirements. However, low-level services provided by operating systems are an integral part of embedded applications, which are furthermore deployed on resource-limited devices. Therefore, the expected benefits of CBSE must not impact the constraints imposed by the targeted domain, such as memory footprint, energy consumption, and execution time. In this paper, we present the componentization of a legacy industry-established Real-Time Operating System, and how component-based applications are built on top of it. We use the Think framework, which allows producing flexible systems while paying for flexibility only where desired. The experiments performed show that the induced overhead is negligible.
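The constant-per-operation model lends itself to a simple summation. The sketch below is hypothetical: the cost table, module names, and byte counts are ours, not Robocop's or Koala's.

```python
# Hypothetical per-operation constant resource declarations, in the spirit
# of the IResource-style model above; all figures are illustrative.

STATIC_COST = {
    # (module, operation) -> static memory claimed per call site, in bytes
    ("Task", "create"): 512,
    ("Mbox", "post"):    64,
    ("Queue", "push"):   96,
}

def static_memory(call_sites):
    """Sum constant per-operation costs over a static call profile,
    given as {(module, operation): number_of_call_sites}."""
    return sum(STATIC_COST[op] * n for op, n in call_sites.items())

print(static_memory({("Task", "create"): 2, ("Mbox", "post"): 5}))  # 1344
```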
... In Koala there is a mechanism for specifying static EFPs, specifically resource usage (for instance, static memory [31]). There is also support for compile-time checking of resources. ...
... The importance of resource awareness in embedded systems is growing rapidly [6, 12, 13, 14, 18, 19, 21]. The limited availability of computing resources is preventing the introduction of new product features and applications, especially in areas where high-performance embedded systems are required. ...
... First, research has been devoted to code-level resource modeling and analysis in component assemblies. In the Koala [13] and Robocop [12] component frameworks, static memory estimation has been performed for applications in which the instantiated components of a composition are known prior to run-time. Such low-level, code-driven resource estimates can only be used when one has access to the components' implementations. ...
In this paper, we introduce the model REMES for formal modeling and analysis of embedded resources such as storage, energy, communication, and computation. The model is a state-machine based behavioral language with support for hierarchical modeling, resource annotations, continuous time, and notions of explicit entry and exit points that make it suitable for component-based modeling of embedded systems. The analysis of REMES-based systems is centered around a weighted sum in which the variables represent the amounts of consumed resources. We describe a number of important resource-related analysis problems, including feasibility, trade-off, and optimal resource-utilization analysis. To formalize these problems and provide a basis for rigorous analysis, we show how to analyze REMES models using the framework of priced timed automata and weighted CTL. To illustrate the approach, we describe a case study in which it has been applied to model and analyze resource usage of a temperature control system.
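The weighted-sum objective mentioned in the abstract can be stated in a few lines. The sketch below is ours; the resource names, amounts, and weights are illustrative and not taken from the REMES case study.

```python
# Weighted sum over consumed resource amounts: cost = sum_i w_i * r_i,
# where r_i is the amount of resource i consumed and w_i its weight.

def resource_cost(consumed, weights):
    """consumed: {resource: amount}; weights: {resource: relative importance}."""
    return sum(weights[r] * amount for r, amount in consumed.items())

consumed = {"storage": 2048, "energy": 3.5, "communication": 120}
weights = {"storage": 0.001, "energy": 10.0, "communication": 0.05}
print(resource_cost(consumed, weights))  # ~43.05
```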
... Code-level memory estimation for, e.g., Koala-based [8, 9] and Robocop-based [12] compositions, as well as higher-level formal approaches [4, 7, 13, 17], aim to establish whether certain resource-related properties hold for a system model. The main problem of building an ES is correlating its various models of different degrees of detail, which are related via abstraction or refinement. ...
... The importance of predicting the resource consumption of component assemblies has motivated many researchers to investigate the issue. Compositional ways of estimating the static memory consumption of Koala-based embedded system models already exist to help meet the resource prediction challenge [8, 9]. Koala [22] is a software component model, introduced by Philips Electronics, designed to build product families of consumer electronics. ...
The conflicting requirements of real-time embedded systems, e.g. minimizing memory usage while still ensuring that all deadlines are met at run-time, require rigorous analysis of the system's resource consumption, starting at early design stages. In this paper, we glance through several representative frameworks that model and estimate resource usage of embedded systems, pointing out advantages and limitations. In the end, we describe our own view on how to model and carry out formal analysis of embedded resources, along with developing the system.
... For consumer electronics devices, Philips has developed and has been using an architectural description language to build products like TVs, VCRs, recorders and combinations thereof, using a component model called Koala [54], [55], [56]. ...
Critical systems in areas ranging from avionics to consumer car control systems are being built by integrating commercial-off-the-shelf (COTS) components. Software components used in these systems need to satisfy many formally unexpressed, yet necessary conditions, termed assumptions, for their correct functioning. Invalid assumptions have been determined to be the root cause of failures in many such systems; for example, in the Ariane 5 rocket failure. In current software engineering practice, many of these assumptions are not recorded in a machine-checkable format, which makes validating the assumptions a manual and error-prone task. This thesis examines this problem in detail and develops a framework, called the assumptions management framework (AMF), which provides a vocabulary for discussing assumptions, a language for encoding assumptions in a machine-checkable format, and facilities to manage the assumptions in terms of composition and setting policies on assumption validation. A relevant subset of assumptions can be validated, or flagged as invalid, automatically as the system evolves. AMF allows the assumption specification process to blend with the components' source code and architecture specification. This enables AMF to be applied to existing systems with minor or no modifications to the components' implementation and design. Performance and scalability tests show that the AMF implementation is scalable enough to be applied to large-scale systems. Case studies were conducted on representative systems to study the nature and number of defects caused by invalid assumptions. It was found that a significant number of defects in the systems studied had invalid assumptions as the root cause, and that AMF is able to encode and validate the majority of the assumptions that cause defects in these systems. This can prevent such defects in the future, or warn in advance of potential defects when assumptions are invalid. Analyzing and correcting one of the invalid assumptions in Iperf, an end-to-end bandwidth measurement tool, resulted in significantly better bandwidth estimates by Iperf across high-bandwidth networks. In most cases, it also resulted in savings of over 90% in terms of both network traffic generated and bandwidth measurement times.
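As a rough illustration of the idea of machine-checkable assumptions (this is not AMF's actual language or API), an assumption can be encoded as a named predicate over a system description and re-validated automatically as the system evolves. The example assumption echoes the Ariane 5 case mentioned above; all names and values are hypothetical.

```python
# Hypothetical encoding of machine-checkable assumptions; not AMF's syntax.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Assumption:
    name: str
    check: Callable[[dict], bool]  # predicate over a system description
    rationale: str

assumptions = [
    Assumption(
        name="horizontal-velocity-range",
        check=lambda sys: abs(sys["h_velocity"]) <= sys["h_velocity_max"],
        rationale="Reused guidance code assumes the Ariane 4 velocity profile.",
    ),
]

def validate(system, assumptions):
    """Return the names of assumptions invalidated by the current system."""
    return [a.name for a in assumptions if not a.check(system)]

system = {"h_velocity": 420.0, "h_velocity_max": 300.0}
print(validate(system, assumptions))  # ['horizontal-velocity-range']
```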
... Of course, these are all simple examples of reflection (see [5] for more examples). The real power becomes evident when building large systems. ...
Consumer products are becoming increasingly software intensive. The software complexity of individual products grows, while the diversity of products increases and the lead time must decrease. Software reuse is the answer to this, not only within a family but also between families of consumer products. We have devised an approach based upon a software component technology to enable reuse. This paper describes that approach, and it zooms in on two important aspects of component-based development. One aspect concerns the prediction of system properties from properties of components, which we illustrate using thread synchronization as example. The other aspect concerns branching of our software in our configuration management systems, where our analysis leads to the discovery that we may be constantly rewriting our own code and to the definition of the turn-over factor to quantify this. We end this paper with a brief validation of our approach.
... But the analysis also depends on the assigned attributes (e.g., for schedulability analysis, the WCETs of the different tasks are needed). Examples of analysis include schedulability analysis [1], memory consumption analysis [5], and reliability analysis [20]. Attributes that are usage- and environment-dependent cannot be analysed in this automated step, since it relies only on information from the component model. ...
Component-based software engineering is a technique that has proven effective for increasing reusability and efficiency in the development of office and Web applications. Though it is also promising for the development of embedded and dependable systems, its true potential in this domain has not yet been realized. In this paper, we present a prototype component technology developed with safety-critical automotive applications in mind. The technology is illustrated by a case study, which is also used as the basis for an evaluation and a discussion of its appropriateness and applicability in the considered domain. Our study provides initial positive evidence of the suitability of our technology, but also shows that it needs to be extended to be fully applicable in an industrial context.
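One classic example of an analysis driven by such attributes is the Liu and Layland utilization bound, which decides rate-monotonic schedulability from WCETs and periods alone. The sketch below is illustrative and not necessarily the test used by the cited technology.

```python
# Utilization-based schedulability test (Liu & Layland bound) for
# rate-monotonic scheduling; each task is a (wcet, period) pair.

def rm_schedulable(tasks):
    """Sufficient test: total utilization <= n * (2^(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(wcet / period for wcet, period in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

# U = 0.25 + 0.25 + 0.10 = 0.60 <= 3 * (2^(1/3) - 1) ~= 0.78 -> True
print(rm_schedulable([(1, 4), (2, 8), (1, 10)]))
```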
... Regarding specifically systems-oriented component models, the following are some major players: Knit [8], Koala [16], MMLite [10] and THINK [9]. Of these, Knit and Koala are build-time component models: i.e. components are not visible at runtime, so there is no systematic support for dynamic component loading, still less managed reconfiguration. ...
OpenCOM v2 is our experimental language-independent component-based systems-building technology. OpenCOM offers more than merely a component-based programming model. First, it is a runtime component model and supports dynamic runtime reconfiguration of systems (i.e. one can load, unload, bind, and rebind components at runtime). Second, it explicitly supports the deployment of the model in a wide range of 'deployment environments' (e.g. operating systems, PDAs, embedded devices, network processors). Third, it allows the particularities of different deployment environments to be selectively hidden from, or made visible to, the OpenCOM programmer without inherent performance overhead.
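The build-time versus runtime distinction can be made concrete with a toy registry. This is not OpenCOM's API; the component names and the use of the term "receptacle" here are loose illustrations of a runtime model's load/bind/rebind capability.

```python
# Toy runtime component registry; illustrates the capability described
# above (runtime load, unload, bind, rebind), not OpenCOM's actual API.

class Runtime:
    def __init__(self):
        self.components = {}  # name -> component instance
        self.bindings = {}    # (client, receptacle) -> provider name

    def load(self, name, component):
        self.components[name] = component

    def unload(self, name):
        self.components.pop(name, None)

    def bind(self, client, receptacle, provider):
        self.bindings[(client, receptacle)] = provider

    def rebind(self, client, receptacle, new_provider):
        self.bindings[(client, receptacle)] = new_provider  # no rebuild needed

rt = Runtime()
rt.load("tuner", object())
rt.load("decoder_v1", object())
rt.bind("tuner", "out", "decoder_v1")
rt.load("decoder_v2", object())
rt.rebind("tuner", "out", "decoder_v2")  # reconfigure the running system
```

In a build-time model such as Knit or Koala, by contrast, the equivalent of `bind` is resolved entirely before compilation, so nothing like `rebind` can exist at runtime.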