A Support Tool for Domain Analysis
Liana Barachisio Lisboa
RiSE - Reuse In Software Engineering
Bahia - Brazil
Email: liana@rise.com.br
Vinicius Cardoso Garcia
Silvio Romero de Lemos Meira
RiSE and Federal University of Pernambuco
Email: {vcg,srlm}@cin.ufpe.br
Eduardo Santana de Almeida
RiSE and Federal University of Bahia
Bahia - Brazil
Email: esa@dcc.ufba.br
Abstract—Nowadays, companies need to improve their competitiveness. Thus, they seek systematic ways of adopting software reuse, and domain analysis is one possibility to reach it. However, it involves the management and analysis of a large set of interrelated information from several systems. Hence, due to its complexity, a support tool is necessary. This paper presents a tool called ToolDAy, which aims at making the process semi-automatic and at aiding the domain analyst to achieve systematic reuse in an effective way. In addition, its evaluations are also described.
Keywords-Domain Analysis Tool, ToolDAy, Evaluation
I. INTRODUCTION
Nowadays, companies are seeking ways to improve their competitiveness, which involves shorter time-to-market and higher product quality. The adoption of software reuse
is an option to obtain these benefits. Although the benefits
of software reuse are promising, it is a complex task to put
it into practice. A way to maximize these possible benefits
is through a systematic reuse approach, which is domain
focused, based on a repeatable process, and concerned with
reuse of higher level life cycle artifacts [1]. One of the
ways to accomplish this is through a domain analysis (DA)
process, which is the process of identifying common and
variable characteristics of systems in a specific domain.
The DA process is composed of interdependent activities that involve the management of complex and interrelated information from various sources. Due to this, relying on human expertise alone in industrial projects, without tool automation, adds risk to a project.
The development of ToolDAy was based on a systematic review of DA tools that analyzed how existing tools supported the DA process [2]. In this review, the authors analyzed nineteen relevant tools to extract the results.
From the results, it was identified that tools usually arise from the need to support a specific process instead of a generic one. However, this may force companies to modify or adapt established development processes, which can lead to a steeper learning curve and a bigger impact on the company's development life cycle [2].
Another outcome was the identification of a set of functionalities that any tool should have. They were extracted from the selected tools and grouped into three phases: (i) Planning, in which the systems are analyzed to determine what should or should not be included in the domain scope; (ii) Modeling, in which the defined domain is modeled in a visual way; and (iii) Validation, in which the generated artifacts are documented and validated.
Table I
FUNCTIONALITIES GROUPED BY PHASE WITH THEIR PRIORITIES
Planning phase:
  Pre-Analysis Documentation: Low
  Domain Matrix: Essential
  Evaluation Functions: Low
  Scope Definition: Important
Modeling phase:
  Domain Representation: Essential
  Variability: Essential
  Mandatory Features: Essential
  Composition Rules: Essential
  Feature Group Identification: Low
  Relationship Types: Low
  Feature Attributes: Low
Validation phase:
  Domain Documentation: Essential
  Feature Documentation: Important
  Requirements Management: Important
  Relationship between Features and Requirements: Low
  Dictionary: Important
  Reports: Important
  Consistency Check: Essential
Product Derivation:
  Product Derivation: Important
  Product Documentation: Important
Furthermore, functionalities for the product derivation
were also identified in the majority of tools. Finally, a total
of twenty functionalities were recognized, which are shown
in Table I with their priorities. With these analyses, the reviewers identified that no tool focuses on all phases, and that the majority of them mainly support the modeling phase.
The tool's development was also motivated by industrial needs experienced over five years of software reuse projects at C.E.S.A.R1, a Brazilian Innovation Institute with CMMI level 3.
This paper presents ToolDAy, whose goal is to support the DA process, making it semi-automatic and aiding the analyst to achieve systematic reuse in an effective way.
1Recife Center for Advanced Studies and Systems - http://www.cesar.org.br
This work extends a previous one [3], in which the main functionalities of the tool (Table I) were described. This paper makes two contributions: (i) it describes the second iteration of ToolDAy and (ii) it reports two evaluations.
II. TOOLDAY
Due to lack of space, several artifacts are only mentioned herein; the artifacts and some screenshots are available at http://www.cin.ufpe.br/vcg/toolday.
The DA process starts with the planning phase. ToolDAy provides a set of fields for pre-analysis documentation that aid in identifying which characteristics should be in the domain. The documentation includes identifying the stakeholders, defining objectives and constraints, and performing market analysis and data collection.
Then, the domain scope can be defined through the product map (the domain matrix in Table I), which relates and compares characteristics of the domain applications, extracted from the pre-analysis documentation, to identify which ones should be part of the domain according to some metrics, called evaluation functions. Their results, which can be mandatory, variable or out of scope, influence the domain scope.
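As an illustration only (this is not ToolDAy's code; the application names, characteristics and the simple counting rule are hypothetical assumptions), the sketch below shows how a product map and an evaluation function could classify each characteristic as mandatory, variable or out of scope:

```python
# Illustrative sketch of a product map and one possible evaluation function.
# Rows are candidate characteristics; columns indicate whether each analyzed
# application provides them. A characteristic present in every application is
# classified as mandatory, in some as variable, and in none as out of scope.

product_map = {
    "user authentication": {"app1": True,  "app2": True,  "app3": True},
    "report export":       {"app1": True,  "app2": False, "app3": True},
    "offline mode":        {"app1": False, "app2": False, "app3": False},
}

def evaluate(presence):
    """Toy evaluation function: map presence counts to a scope decision."""
    count = sum(presence.values())
    if count == len(presence):
        return "mandatory"
    if count == 0:
        return "out of scope"
    return "variable"

for characteristic, presence in product_map.items():
    print(f"{characteristic}: {evaluate(presence)}")
```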
In the modeling phase, ToolDAy performs the domain representation with a feature model, in which features are diagram elements and their types are given by relationships (which can be alternative, or, optional and mandatory) plus composition rules (which can be implication and exclusion).
Also regarding the relationships between features, they can be represented in the model with different line formats. The types are composition, used when there is a whole-part relationship; generalization, when a feature is a generalization of its sub-features; and implementation, when one feature implements another. There is also a default relationship that has no type.
Besides, features can be grouped according to the information they represent. The groups are identified through different colors in the feature border. They can be: capability, which characterizes a distinct service or functionality a product may have; operating environment, which represents an environment attribute in which the product is used; domain technology, which corresponds to a domain-specific implementation; and implementation technique, which covers implementation details that can be reused across domains.
ToolDAy also supports the inclusion of attributes for features. They synthesize the representation of a large number of possible variations, improving the understandability of the feature model.
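The elements described above (relationship kinds, composition rules, feature groups and attributes) could be captured by a data structure along the lines of the following sketch. This is only a hypothetical illustration, not ToolDAy's internal model; all names and the example domain are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Feature:
    """One node of the feature model."""
    name: str
    relation: str = "mandatory"      # mandatory, optional, alternative or "or"
    group: Optional[str] = None      # capability, operating environment,
                                     # domain technology, implementation technique
    attributes: Dict[str, str] = field(default_factory=dict)
    children: List["Feature"] = field(default_factory=list)

@dataclass
class CompositionRule:
    """Cross-tree constraint: implication or exclusion between two features."""
    kind: str                        # "implication" or "exclusion"
    source: str
    target: str

# Hypothetical media-player domain.
root = Feature("media player", children=[
    Feature("playback", "mandatory", "capability",
            attributes={"max bitrate": "320 kbps"},
            children=[
                Feature("mp3", "alternative"),
                Feature("ogg", "alternative"),
            ]),
    Feature("equalizer", "optional", "capability"),
    Feature("low memory mode", "optional", "operating environment"),
])

rules = [
    CompositionRule("implication", "equalizer", "playback"),
    CompositionRule("exclusion", "equalizer", "low memory mode"),
]
```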
For the validation phase, ToolDAy provides a large set of documentation. Apart from the domain and feature documentation [3], ToolDAy permits the description of requirements and use cases, which are optional artifacts in project execution.
The requirement documentation includes priority, type (functional or non-functional) and description, while the use case documentation includes pre- and post-conditions and the execution flows: main, alternative and exception.
After specifying the requirements, use cases and features, it is possible to map the traceability among them. This facilitates the identification of the impact of a modification. The traceability is captured in a visual model with a different diagram shape for each artifact.
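A minimal sketch of such traceability links, with hypothetical identifiers (the structure is illustrative and not ToolDAy's):

```python
# Each feature is linked to the requirements and use cases it realizes, so the
# impact of changing one artifact can be traced to the related ones.
traceability = {
    "F01 playback":  {"requirements": ["R01"], "use_cases": ["UC01", "UC02"]},
    "F02 equalizer": {"requirements": ["R02"], "use_cases": ["UC03"]},
}

def impact_of(feature: str) -> list:
    """Return the artifacts potentially affected when a feature changes."""
    links = traceability.get(feature, {})
    return links.get("requirements", []) + links.get("use_cases", [])

print(impact_of("F02 equalizer"))   # ['R02', 'UC03']
```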
The consistency checker verifies whether the relationships between features are correct. ToolDAy's consistency rules are divided into three categories: redundancy, when the same semantic information is represented in more than one way; anomaly, when some feature configurations are lost and the domain cannot be completely configured; and inconsistency, when a representation contradicts other information in the model. Each category has a set of verifications [4].
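To make the inconsistency category concrete, the toy check below flags two typical contradictions (mutually exclusive mandatory features and a feature excluding itself). It is a simplified illustration and not ToolDAy's rule set, which follows [4]; the redundancy and anomaly categories are omitted, and all model data is assumed.

```python
# Simplified model: a set of always-selected (mandatory) features plus
# cross-tree implication and exclusion pairs.
mandatory = {"playback", "library"}
implications = {("equalizer", "playback")}
exclusions = {("playback", "library")}     # contradicts the mandatory set

def find_inconsistencies(mandatory, implications, exclusions):
    problems = []
    for a, b in exclusions:
        if a == b:
            problems.append(f"inconsistency: {a} excludes itself")
        elif a in mandatory and b in mandatory:
            problems.append(
                f"inconsistency: mandatory features {a} and {b} exclude each other")
    for a, b in implications:
        if (a, b) in exclusions or (b, a) in exclusions:
            problems.append(
                f"inconsistency: {a} implies {b} but they also exclude each other")
    return problems

for problem in find_inconsistencies(mandatory, implications, exclusions):
    print(problem)
```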
Moreover, ToolDAy also supports the inclusion of a dictionary, whose purpose is to clarify the terms of the domain.
Furthermore, there is little advantage in providing a large set of documentation if it is only available within the tool environment. Thus, to make it visible in other environments, ToolDAy permits the generation of several reports in different formats (PDF, Excel or images).
The product derivation with ToolDAy is done through
the selection of the domain features. This selection occurs
in a tree view representing the domain hierarchy. There
is also a consistency checker for the product, but it has
different validation rules [4] and all of them are classified as
inconsistencies. Once the product is valid, the product model
can be generated. In this model all features are mandatory
and the user can include new features, usually the ones
marked as "out of scope" in the product map. Each product
has a simple documentation that includes the domain version
it was based on and its description.
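The product-level check can be pictured as validating a feature selection against the domain constraints, as in the hypothetical sketch below (again only an illustration, not ToolDAy's implementation; the constraints and feature names are assumed):

```python
# Cross-tree constraints of a hypothetical domain.
implications = {("equalizer", "playback")}
exclusions = {("equalizer", "low memory mode")}

def validate_product(selection):
    """Return (is_valid, message) for a set of selected feature names."""
    for a, b in implications:
        if a in selection and b not in selection:
            return False, f"{a} requires {b}"
    for a, b in exclusions:
        if a in selection and b in selection:
            return False, f"{a} excludes {b}"
    return True, "valid product"

print(validate_product({"playback", "equalizer"}))
# (True, 'valid product')
print(validate_product({"playback", "equalizer", "low memory mode"}))
# (False, 'equalizer excludes low memory mode')
```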
III. EVALUATION
To evaluate the described tool, two case studies were
performed. The first was in a controlled environment, while
the second was in a software company.
A. Controlled Environment
The case study followed guidelines from [5]. This study
was performed before the second development iteration.
The goal of the study was to analyze ToolDAy with respect to its aid in DA execution and its ease of use. To achieve it, some questions were defined for the subjects: (Q1) whether ToolDAy aids in the process execution; (Q2) whether there were any difficulties in using the tool; (Q3) whether the consistency checker is helpful; and (Q4) whether ToolDAy's tutorial is enough to learn how to use it.
The study was conducted in a graduate course at a university lab, from November 2007 to February 2008, by students who performed a domain engineering (DE) project based on a real-world case.
The project consisted of four reuse tools, which provided solutions to increase the organization's productivity through reuse according to its maturity level. At the end of the DA, the subjects generated a feature model with 64 features, 14 requirements and 38 use cases.
The subjects were six students who played the domain analyst role. All of them had previously worked with the platform ToolDAy uses (Eclipse2), and half of them knew some DA processes, while the other half knew just one.
After the project was concluded, the subjects were asked to fill in a feedback form with questions related to it. Their answers are described next.
Q1: Four subjects considered that the tool aided in the process execution, another judged that the tool did not help much during the process execution, and the last did not see any gain in using the tool.
The reasons given by the two subjects for not considering the tool helpful were: a great part of the process can be done without tool support; lack of integration with the next steps of DE; and the tool is not completely integrated with the DA process used. However, ToolDAy's focus is on supporting the DA process, a few steps of product derivation, and generating the artifacts that will later be used in the DE phases; it does not intend to cover the complete process.
The other subjects explained why they considered the tool
helpful: it helps the process execution steps; and it aids in
the scope definition and in the domain modeling.
They also described some weaknesses and strengths of the tool. The weaknesses were: the traceability among requirements, use cases and features is too simple (in the evaluation, requirements and use cases only had a name and description, and the traceability editor did not exist); lack of requirement management; and a model with too many features becomes cluttered. Some of the strengths were the consistency checker, the generation of reports, and the visual representation of the domain model.
Q2: Only one subject did not have difficulty with the tool. The other answers were (multiple answers per subject): two said that the navigation is hard (both are familiar with the platform and with some DA processes); two answered that there was a lack of, or insufficient, explanation on how to use the tool (both are familiar with the platform, but one knows few DA processes and the other just one); and one subject attributed the difficulty to his lack of knowledge of the DA process (he knows only one DA process). In addition, a few subjects also reported that the difficulty was caused by the symbols used to represent the relationships and by the traceability between the domain feature model and the product map.
Even though several subjects reported at least one difficulty, the difficulties were mostly related to the GUI or to specific aspects of traceability (which have already been improved) and not to the main functionalities of the tool.
2http://www.eclipse.org
Q3: Only one subject reported that the consistency checker did not fully aid problem identification. The restriction was the lack of an easier identification of where the problem is, i.e., the tool should point to the exact spot of the problem. The others considered it sufficient for identifying and resolving the problems.
Q4: Four subjects considered the information in the tutorial sufficient for learning how to use the tool, and the other two subjects did not use it. In addition, one of those who did not use it was the same subject who declared having difficulty due to lacking or insufficient explanation on how to use the tool. This indicates that the tutorial may overcome this issue.
Even though the analysis is not conclusive, the study indicated that the tool has some strengths for DA support. On the other hand, aspects related to understanding (difficulties during the process execution) were the focus of the second iteration.
Nevertheless, some of the problems described by the subjects can be resolved with proper training before the tool starts to be used. After concluding the study, some aspects should be considered before repeating it.
Training. Instead of the subjects learning how to use the tool through the tutorial, basic training can be applied. The training can emphasize the aspects not used by the subjects of this study and the complaints related to them.
Questionnaires. The questionnaires should be reviewed
to collect more precise data related to where the problems
and difficulties occurred.
According to the feedback, some requirements were identified and developed in the second iteration. The consistency checker now selects the exact spot where the problem is. The requirements and use cases (as described before) are more detailed and permit traceability with features. The traceability model can be exported as an Excel document. Additionally, some visualization filters for models with too many features were created.
The other improvements derived from the questionnaires, such as importing the features added in the domain feature model into the product map and searching for features, have not been implemented yet.
B. Industrial Case Study
The industrial case was developed at C.E.S.A.R. and is part of the company's software reuse effort, as a way to institutionalize reuse in all of its projects.
The business goals defined for the project were: (a) increase productivity and (b) reduce maintenance costs and development efforts. The pilot project selected was a web/social network with seven different releases. The project goal was to adopt a software product line with the benefits of: (i) better understandability of the project business; (ii) identification of new market opportunities; (iii) identification of new functionalities for the product; and (iv) decreased maintenance costs.
The team goal for the DA phase was to identify existing features from other tools that were not present in their own tool. They performed the DA by documenting the domain, defining its scope and building a glossary. They created the product map of the analyzed applications and the feature model of the domain with 74 features.
At the end, the team considered that the DA goal was
achieved. Moreover, a better understanding of the domain
being developed was accomplished, since several new fea-
tures were identified and planned to be developed.
The domain analyst did not follow any formal DA process; therefore, he followed ToolDAy's steps and had no difficulty in using it. However, improvements were suggested, such as the possibility to export the product map as a table.
Even though the goal was achieved and the process result brought benefits to the team, for the domain analysts there was no concrete data showing that using ToolDAy during the DA process aided it. However, it is necessary to highlight that, since the user did not follow a specific process, the tool contributed to the process execution by providing a guideline to define the domain scope and its feature model and to document them. Furthermore, this industrial case worked as a proof of concept for the tool.
IV. RELATED WORK
The systematic review [2] identified nineteen tools supporting DA processes. Some concentrate on the planning phase - like PuLSE-BEAT [6] and DREAM [7] - while others on the modeling phase - such as RequiLine [8]. The support for the validation phase varies among them, from almost full to no support. Two tools stand out in the review according to the number of essential requirements they support: Holmes and RequiLine.
Holmes [9] provides functionalities for all identified phases. It permits pre-analysis, domain documentation, composition rules support and validation, along with the domain scope definition and a visual representation of the domain. Even though it supports product instantiation, no documentation is provided for the products or the features.
RequiLine [8] supports almost all functionalities for the modeling phase, except feature group identification. It provides a vast documentation for the domain and features, and permits the inclusion of requirements that can be related to the features in the domain. Besides, it supports product derivation and documentation.
Among the analyzed tools, none provides full support to the process, and the two closest to it still lack a few functionalities. ToolDAy's development was planned to address this problem and, according to the results of the review, it fulfills this goal.
V. CONCLUSION
This paper presented ToolDAy, a tool that offers support to several requirements within the process, which include scope definition, domain modeling, documentation and consistency checking. Furthermore, ToolDAy also supports requirements for product derivation.
Supporting requirements in every phase, within a single environment, is one of ToolDAy's advantages when compared to other DA tools. Even though it has a defined process, it can be used without following it, since the artifacts of planning and modeling (domain and product) can be built independently.
The evaluations highlighted improvements that are being developed. Furthermore, they indicate that the tool aids in the process support, but new studies are necessary. As future work, ToolDAy is being extended with feature interactions, metrics regarding the domain information, and different visualization views for the domain representation.
ACKNOWLEDGMENT
This work was partially supported by the National Institute of Science
and Technology for Software Engineering (INES3), funded by CNPq and
FACEPE, grants 573964/2008-4 and APQ-1037-1.03/08.
REFERENCES
[1] W. B. Frakes and S. Isoda, "Success factors of systematic reuse," IEEE Software, vol. 11, no. 5, pp. 14-19, 1994.
[2] L. B. Lisboa, D. Lucrédio, V. C. Garcia, E. S. Almeida, S. R. L. Meira, and R. P. M. Fortes, "A systematic review of domain analysis tools," Journal of Information and Software Technology, vol. 52, pp. 1-13, 2010.
[3] L. B. Lisboa, V. C. Garcia, E. S. Almeida, and S. L. Meira, "ToolDAy - a process-centered domain analysis tool," in Brazilian Symposium on Software Engineering - Tools Session, Brazil, 2007, pp. 54-60.
[4] T. v. d. Massen and H. Lichter, "Deficiencies in feature models," in Workshop on Software Variability Management for Product Derivation, USA, 2004.
[5] C. Wohlin, P. Runeson, M. Höst, M. C. Ohlsson, B. Regnell, and A. Wesslén, Experimentation in Software Engineering: An Introduction. Kluwer Academic Publishers, 2000.
[6] K. Schmid and M. Schank, "PuLSE-BEAT: a decision support tool for scoping product lines," in Software Architectures for Product Families, Spain, 2000, pp. 65-75.
[7] M. Moon, K. Yeom, and H. S. Chae, "An approach to developing domain requirements as a core asset based on commonality and variability analysis in a product line," IEEE Transactions on Software Engineering, vol. 31, no. 7, pp. 551-569, 2005.
[8] T. v. d. Massen and H. Lichter, "RequiLine: a requirements engineering tool for software product lines," in International Workshop on Product Family Engineering. Italy: Springer-Verlag, 2003, pp. 168-180.
[9] G. Succi, J. Yip, E. Liu, and W. Pedrycz, "Holmes: a system to support software product lines," in International Conference on Software Engineering. Ireland: ACM, 2000, p. 786.
3INES - http://www.ines.org.br