Statistical analysis of deviation of actual cost from estimated cost using actual project data

Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531, Japan
Information and Software Technology (Impact Factor: 1.05). 05/2000; 42(7):465-473. DOI: 10.1016/S0950-5849(00)00092-6
Source: DBLP


This paper analyzes the association between the deviation of the actual cost (measured in person-months) from the estimated cost and the quality and productivity of software development projects. Although the results themselves may not be new from an academic point of view, they can motivate developers to join process improvement activities in a software company and thus become a driving force for promoting process improvement. We show that if a project is performed faithfully under a well-organized project plan (i.e. the plan is first constructed according to the standards of good writing, and the project is then managed and controlled to meet the plan), the deviation of the actual cost from the estimated one becomes small. We then show statistically that projects with a small deviation of the cost estimate tend to achieve high quality in the final products and high productivity of the development teams. In this analysis, actual data from 37 projects at a single company are used extensively.
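
As a rough illustration of the kind of analysis summarized above, the sketch below computes the relative deviation of actual from estimated effort for a handful of projects and checks its rank correlation with quality (fault density) and productivity. The figures and the choice of Spearman correlation are illustrative assumptions, not the paper's 37-project data set or its exact statistical tests.

import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-project data (not from the paper):
# effort in person-months, fault density in faults/KLOC, productivity in KLOC/person-month.
estimated = np.array([10.0, 24.0, 8.0, 40.0, 15.0])
actual    = np.array([11.0, 30.0, 8.5, 62.0, 16.0])
faults    = np.array([ 0.8,  2.1, 0.6,  3.5,  1.0])   # lower = higher quality
prod      = np.array([ 1.9,  1.2, 2.1,  0.7,  1.8])

# Relative deviation of the actual cost from the estimate.
deviation = np.abs(actual - estimated) / estimated

rho_q, p_q = spearmanr(deviation, faults)
rho_p, p_p = spearmanr(deviation, prod)
print(f"deviation vs fault density: rho={rho_q:.2f} (p={p_q:.3f})")
print(f"deviation vs productivity:  rho={rho_p:.2f} (p={p_p:.3f})")

With data of this shape, a positive correlation with fault density and a negative one with productivity would correspond to the paper's finding that small cost-estimate deviations go together with high quality and high productivity.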

  • "Jorgensen et al. [14] evaluated an adjustment approach to estimate project values where they adjusted the estimates toward productivity values of the more average projects. Mizuno et al. [17] "
    ABSTRACT: When the final tally was in, the year 2000 (Y2K) software compliance issue had cost over a hundred billion dollars worldwide. The fact that essentially everyone was busy tackling the same problem provided a unique opportunity to use data envelopment analysis (DEA) to measure software team efficiency and productivity. The data set analyzed in this paper contained about 70 programs from a large Canadian bank. While there were about a dozen different programming languages and a number of hardware platforms involved, the work was very similar in nature as they were all fixing the Y2K "bug". We examined both team productivity and programmer efficiency when maintaining code where the maintenance objective was the same in all cases. DEA models were developed to measure software project efficiency focusing on the factors that affect software productivity, and we discuss how these findings could be applied to other projects. Suggestions are offered on how DEA could be combined with the bank's own ratio-based rating system to improve their software production metrics. Finally, potential management uses of these DEA results are presented.
    IEEE Transactions on Engineering Management 09/2004; 51(3):279-287. DOI: 10.1109/TEM.2004.830843 · 1.10 Impact Factor (see the DEA sketch after this list)
  • ABSTRACT: Doctor of Engineering dissertation, No. 16500, conferred September 20, 2001.
  • ABSTRACT: This paper provides an extensive review of studies related to expert estimation of software development effort. The main goal and contribution of the review is to support the research on expert estimation, e.g., to ease other researchers' search for relevant expert estimation studies. In addition, we provide software practitioners with useful estimation guidelines, based on the research-based knowledge of expert estimation processes. The review results suggest that expert estimation is the most frequently applied estimation strategy for software projects, that there is no substantial evidence in favour of the use of estimation models, and that there are situations where we can expect expert estimates to be more accurate than formal estimation models. The following 12 expert estimation “best practice” guidelines are evaluated through the review: (1) evaluate estimation accuracy, but avoid high evaluation pressure; (2) avoid conflicting estimation goals; (3) ask the estimators to justify and criticize their estimates; (4) avoid irrelevant and unreliable estimation information; (5) use documented data from previous development tasks; (6) find estimation experts with relevant domain background and good estimation records; (7) estimate top-down and bottom-up, independently of each other; (8) use estimation checklists; (9) combine estimates from different experts and estimation strategies; (10) assess the uncertainty of the estimate; (11) provide feedback on estimation accuracy and development task relations; and (12) provide estimation training opportunities. We found supporting evidence for all 12 estimation principles, and provide suggestions on how to implement them in software organizations.
    Journal of Systems and Software 02/2004; 70(1-2):37-60. DOI: 10.1016/S0164-1212(02)00156-5 · 1.35 Impact Factor
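
The first citing article above measures team efficiency with data envelopment analysis (DEA). As a hedged sketch of that technique (not the authors' models or data), the following solves the standard input-oriented CCR envelopment linear program for each project, using hypothetical inputs and outputs; an efficiency score of 1.0 marks a project on the efficient frontier.

import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one row per project (DMU).
inputs = np.array([    # e.g. person-months spent, number of programmers
    [12.0, 3.0],
    [20.0, 5.0],
    [ 8.0, 2.0],
    [15.0, 4.0],
])
outputs = np.array([   # e.g. programs remediated, KLOC converted
    [30.0, 10.0],
    [40.0, 12.0],
    [25.0,  9.0],
    [33.0, 11.0],
])

def ccr_efficiency(inputs, outputs, o):
    """Input-oriented CCR efficiency of project `o` (1.0 = efficient)."""
    n, m = inputs.shape            # n projects, m inputs
    s = outputs.shape[1]           # s outputs
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-inputs[o].reshape(-1, 1), inputs.T])
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro  (i.e. produce at least y_ro)
    A_out = np.hstack([np.zeros((s, 1)), -outputs.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -outputs[o]]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

for o in range(len(inputs)):
    print(f"project {o}: CCR efficiency = {ccr_efficiency(inputs, outputs, o):.3f}")

In a real study the choice of inputs and outputs drives the results; the cited paper discusses which productivity factors were used for the Y2K maintenance teams.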