Diagram showing the steps used during the reconstruction of the project-specific software development method. BM is used as an acronym for "base method".

Source publication
Article
Software development is a complex process that requires disciplined engineering approaches. Empirical studies show that companies still don’t document their development practices, or, if they do, the documentation is not up-to-date and does not reflect how they actually develop software. The main objective of this paper is to propose an approach that can help compa...

Context in source publication

Context 1
... and to reconstruct more than just the disciplines. In this section, we first describe the meta-model that was constructed in cooperation with the participating companies, and then describe how this information can be reconstructed from the repositories. The steps for reconstructing the project-specific software development method are shown in Fig. 2. In the figure, each step links to the section in which it is explained in more detail. ...

Similar publications

Article
Automatic static analysis tools (ASATs) are instruments that support code quality assessment by automatically detecting defects and design issues. Despite their popularity, they are characterized by (i) a high false-positive rate and (ii) the low comprehensibility of the generated warnings. However, no prior studies have investigated the usage of A...

Citations

... Some researchers have mapped these solutions to general, customizable practices. For example, Janković et al. analyzed data repositories of projects conducted in specific companies and elicited the methodology constituents and the relationships between them [19]. In effect, they used a bottom-up approach to find reusable methodology constituents by generalizing recurrent solutions extracted from the data repositories of specific projects. ...
Article
Earlier software development processes (SDPs), such as waterfall processes, were mainly focused on process steps and did not address people- and product-related issues. The emergence of software development methodologies (SDMs) has created a new paradigm for developing software systems. An SDM is a special kind of technically engineered framework for organizing SDPs; this framework is expected to specify three main interwoven elements, namely people, products, and process. It has since become evident that it is impossible to provide a general-purpose SDM for developing all the various kinds of software systems, and it has thus become essential to construct the most appropriate methodology for the system development situation at hand, a practice commonly called Situational Method Engineering (SME). The problem with existing SME methods is a lack of adequate attention to the role of people who might seek or possess valuable knowledge about the project situation. This knowledge can be tacit information hidden in the developer’s mind, or it might be explicitly available. This paper proposes a knowledge management (KM)-driven and DevOps-based SME method as a new integrated multi-view methodological paradigm that satisfies the need for sharing human experience in engineering SDMs. The method has been proposed by reusing general SME practices and complementing them with appropriate KM and DevOps practices to alleviate the weaknesses of previous SME methods. Furthermore, the proposed method has been evaluated through four case studies and through a criteria-based comparison with eight prominent SME methods.
... Remarkably, the results tend to suggest that, for a homogeneous set of projects (in terms of application domain, nature of the computation performed, and implementation technology), it is possible to obtain more accurate estimates of the functional size (Lavazza & Liu, 2019). Before specifying the requirements in detail, an early assessment can be made if there is not enough time to apply other standard methods (Borandag et al., 2019; Janković et al., 2019). ...
Article
This paper proposes a new, improved COmmon Software Measurement International Consortium function point (COSMIC FP) method that uses Artificial Neural Network (ANN) architectures based on Taguchi’s Orthogonal Arrays to estimate software development effort. The minimum magnitude of relative error (MRE) was used to evaluate these architectures, considering the cost effect function and the type of data used in the training, testing, and validation of the proposed models. By applying fuzzification and clustering to obtain seven different datasets, we aim to achieve excellent reliability and accuracy of the obtained results. Besides examining the influence of four input values, we aim to reduce the risk of potential errors, increase the coverage of a wide range of different projects, and increase the efficiency and success of completing various software projects. The main contributions of our work are as follows: the influence of four input values of the COSMIC FP method on the change of mean MRE, the development of two simple ANN architectures, a small number of iterations in software effort estimation (fewer than 7), reduced software effort estimation time, and the use of International Software Benchmarking Standards Group and other datasets in the experiment.
... These numerical values of 15 cost drivers in COCOMO81 and 17 cost drivers in COCOMO2000 are multiplied to produce the effort adjustment factor (EAF). The performance of estimation strategies [40] is often evaluated using ratio-based accuracy metrics [41] such as RE (relative error), MRE (magnitude of relative error), MAE (mean absolute error) [42], and MMRE (mean magnitude of relative error), which are calculated as follows (12) ...
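The accuracy metrics named above can be sketched in a few lines of Python, assuming the standard definitions from the effort-estimation literature (the variable names below are illustrative, not taken from the cited papers):

```python
# Sketch of the ratio-based accuracy metrics RE, MRE, MAE, and MMRE,
# assuming their usual definitions from the effort-estimation literature.

def relative_error(actual, estimated):
    # RE: signed relative deviation of the estimate from the actual effort
    return (actual - estimated) / actual

def mre(actual, estimated):
    # MRE: magnitude (absolute value) of the relative error
    return abs(actual - estimated) / actual

def mae(actuals, estimates):
    # MAE: mean absolute error over all projects
    return sum(abs(a - e) for a, e in zip(actuals, estimates)) / len(actuals)

def mmre(actuals, estimates):
    # MMRE: mean MRE over all projects
    return sum(mre(a, e) for a, e in zip(actuals, estimates)) / len(actuals)

# Illustrative data: actual vs. estimated effort (e.g. person-hours)
actual_efforts = [100.0, 250.0, 80.0]
estimated_efforts = [110.0, 200.0, 88.0]

print(round(mmre(actual_efforts, estimated_efforts), 4))  # → 0.1333
```

MMRE is the figure usually reported when comparing estimation models across a whole project set, which is why minimizing it is the stated goal of the ANN architectures above.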
Article
In this paper, two different architectures of Artificial Neural Networks (ANNs) are proposed as an efficient tool for predicting and estimating software effort. Artificial Neural Networks, as a branch of machine learning, are used in estimation because they tend toward fast learning and give better, more accurate results. The search/optimization approach adopted here is motivated by the Taguchi method based on Orthogonal Arrays (a special set of Latin Squares), which has proven to be an effective tool in robust design. This paper aims to minimize the magnitude of relative error (MRE) in effort estimation by using Taguchi’s Orthogonal Arrays, as well as to find the simplest possible ANN architecture for optimized learning. A descending gradient (GA) criterion has also been introduced to determine when to stop performing iterations. Given the importance of estimating software projects, our work aims to cover as many different values of actual effort from a wide range of projects as possible, by dividing them into clusters and applying a certain coding method in addition to the mentioned tools. In this way, the risk of estimation error can be reduced, increasing the rate of completed software projects.
Article
The version control system of every software product can provide important information about how the system is connected. In this study, we first propose a language-independent method to collect and filter dependencies from the version control system, and second, we use the results obtained in the first step to identify key classes from three software systems. To identify the key classes, we use the dependencies extracted from the version control system both together with dependencies from the source code and separately. The results show that, compared with using only dependencies extracted from the code, mixing both types of dependencies provides small improvements. Using only dependencies from the version control system yielded results that did not surpass those mentioned above, but they are still acceptable. We still consider this an important result, because it might open an important opportunity for software systems that use dynamically typed languages such as JavaScript, Objective-C, Python, and Ruby, or systems that use multiple languages. These types of systems, for which code dependencies are harder to obtain, can use the dependencies extracted from version control to gain better knowledge about the system.
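The general idea of deriving dependencies from version control — files that repeatedly change in the same commit are likely coupled — can be sketched as follows. This is a minimal illustration of co-change counting, not the filtering method from the paper; it assumes commits have already been parsed into lists of changed file paths (e.g. from `git log --name-only`), and the file names are hypothetical:

```python
# Minimal sketch: derive co-change "dependencies" between files from a
# version-control history. Each commit is a list of changed file paths.
from collections import Counter
from itertools import combinations

def co_change_counts(commits):
    """Count how often each pair of files is changed in the same commit."""
    pairs = Counter()
    for changed_files in commits:
        # sorted(set(...)) deduplicates paths and gives each pair a
        # canonical (a, b) ordering so counts accumulate correctly
        for a, b in combinations(sorted(set(changed_files)), 2):
            pairs[(a, b)] += 1
    return pairs

# Illustrative history: Cart.js and Order.js change together twice
commits = [
    ["Order.js", "Cart.js"],
    ["Order.js", "Cart.js", "Invoice.js"],
    ["Cart.js"],
]
deps = co_change_counts(commits)
print(deps[("Cart.js", "Order.js")])  # → 2
```

Because it only looks at commit metadata, this kind of extraction is language-independent, which is exactly why it remains applicable to dynamically typed or multi-language systems where static code dependencies are harder to obtain.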