Article

Planning and executing time-bound projects


Abstract

A time-bound project is constrained by hard deadlines. Since most time-bound projects start with more requirements than developers can handle within the imposed time constraints, requirements often must be slashed halfway through the project, resulting in missed deadlines, customer frustration, and wasted effort. Statistically Planned Incremental Deliveries (SPID) offers an approach that addresses these problems by combining ideas from critical chain planning, incremental development, and rate monitoring into a practical method for planning and executing time-bound projects.
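The abstract gives no implementation detail, but the core idea of statistically planned increments can be sketched: given three-point effort estimates, find the largest priority-ordered feature set that fits the time box at a confidence level chosen by the organization. The sketch below is a minimal Monte Carlo illustration of that idea, not the paper's actual procedure; the triangular distribution, feature list, and budget are all assumptions.

```python
import random

def effort_sample(optimistic, likely, pessimistic):
    # Triangular distribution as a stand-in for the (here unspecified)
    # underlying effort distribution of each feature.
    return random.triangular(optimistic, pessimistic, likely)

def scope_at_confidence(features, budget, confidence=0.9, trials=10_000):
    """Largest priority-ordered prefix of `features` whose total effort
    fits within `budget` with probability >= `confidence`, estimated
    by Monte Carlo simulation."""
    best = 0
    for n in range(1, len(features) + 1):
        fits = sum(
            sum(effort_sample(*f) for f in features[:n]) <= budget
            for _ in range(trials)
        )
        if fits / trials >= confidence:
            best = n
        else:
            break  # adding lower-priority features only lowers the odds
    return features[:best]

# Hypothetical (optimistic, likely, pessimistic) efforts in person-days,
# listed in the customer's priority order.
random.seed(0)
features = [(3, 5, 10), (2, 4, 9), (4, 6, 14), (3, 7, 15), (2, 3, 8)]
print(scope_at_confidence(features, budget=25))  # first increment's scope
```

Features beyond the cut-off are not discarded; in the SPID spirit they form later increments planned at progressively lower completion probabilities.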


... To be effective, time boxing requires that [1]: ...
... The "Should have" category represents high-priority items that should be included in the solution if possible. 1 These two terms will be used loosely and interchangeably to refer to a discrete capability requested by the sponsor of the work ...
... It is clear that by scheduling features at the safe level, the most work we can accommodate within the time box boundaries is that depicted by the patterned area in Figure 1.b. So for the "must have" category, the customer must select, from among all requirements, those which are most important to him, until exhausting the number of development hours available when scheduled at the safe effort level. (Footnote 2: More sophisticated approaches, such as Statistically Planned Incremental Deliveries (SPID) [1], require three-point estimates and the specification of an underlying distribution. Footnote 3: As with the redefinition of the MoSCoW categories in this article, I am avoiding the temptation to call these estimates the 50% and 90% probability estimates, to prevent giving a false sense of mathematical exactness, which would require additional assumptions or an analysis that might not be justified by the practical impact of the added accuracy and precision. Footnote 4: If a single project had to insure against all possible risks and uncertainty, its price would be prohibitive [5].) ...
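The safe-level scheduling described in the excerpt above can be illustrated with a minimal sketch: cost each requirement at its conservative ("safe") estimate and let the customer's priority order drive selection until the time box is full. The requirement names and effort figures below are invented for illustration.

```python
def select_must_haves(requirements, capacity):
    """Pick requirements, costed at their conservative 'safe' estimate,
    in the customer's priority order until the time box is full.
    Items that do not fit fall to a lower MoSCoW category. Note that a
    smaller, lower-priority item may still be pulled in after a larger
    one is skipped; other policies would stop at the first misfit."""
    selected, used = [], 0
    for name, safe_effort in requirements:
        if used + safe_effort <= capacity:
            selected.append(name)
            used += safe_effort
    return selected, capacity - used

# Hypothetical requirements with safe-level estimates in developer-days,
# already sorted by the customer's priority.
reqs = [("login", 8), ("search", 12), ("reports", 20), ("export", 6)]
chosen, slack = select_must_haves(reqs, capacity=30)
print(chosen, slack)  # → ['login', 'search', 'export'] 4
```

Because the safe estimates are deliberately pessimistic, any effort not consumed at execution time becomes room for "should have" items, which is exactly the assurance the excerpt is after.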
Article
Full-text available
Time boxing is a management technique which prioritizes schedule over deliverables, but time boxes that are merely a self- or outside-imposed target, without agreed partial outcomes and justified certainty, are at best an expression of good will on the part of the team. This essay proposes the use of a modified set of MoSCoW rules which accomplish the objectives of prioritizing deliverables and providing a degree of assurance as a function of the uncertainty of the underlying estimates.
... The classification was made on the basis of the requirements' own value and was unconstrained, i.e. all the requirements meeting the criteria for "Must Have" could be classified as such. In 2002, the SPID method [6] used a probabilistic backcasting approach to define the scope of three software increments roughly corresponding to the Must Have, Should Have and Could Have categories, but constraining the number of Must Haves to those that could be completed within budget at a level of certainty chosen by the organization. In 2006, the DSDM Consortium, now the Agile Business Consortium, published DSDM Public Version 4.2 [7], establishing the 60/20/20% recommendation, although this had probably been used earlier by Consortium members in their own practices. ...
... independent estimates; a global correlation coefficient of 0.6 for correlated estimates; subject to the maximum allocation of effort for each category: ...
Chapter
Full-text available
This article analyzes the performance of the MoSCoW method to deliver all features in each of its categories: Must Have, Should Have and Could Have using Monte Carlo simulation. The analysis shows that under MoSCoW rules, a team ought to be able to deliver all Must Have features for underestimations of up to 100% with very high probability. The conclusions reached are important for developers as well as for project sponsors to know how much faith to put on any commitments made.
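A toy version of the kind of Monte Carlo experiment this chapter describes might look as follows, assuming the DSDM 60/20/20 budget split and a uniformly distributed underestimation factor; the paper's actual simulation setup is richer than this, so the numbers below are purely illustrative.

```python
import random

def p_deliver_must(est_must=60, budget=100, max_under=1.0, trials=20_000):
    """Probability that all Must Have work, estimated at 60% of the budget
    (the DSDM 60/20/20 rule), still fits within the full budget when actual
    effort exceeds the estimate by a factor drawn uniformly from
    [0, max_under]."""
    random.seed(1)
    hits = sum(
        est_must * (1 + random.uniform(0, max_under)) <= budget
        for _ in range(trials)
    )
    return hits / trials

# With up to 100% underestimation, the Must Haves fit whenever the
# underestimation factor stays below 100/60 - 1 = 2/3, i.e. about 67%
# of the time under this uniform assumption.
print(round(p_deliver_must(), 3))
```

The 20% + 20% of budget nominally assigned to Should Have and Could Have work acts as the contingency that absorbs the underestimation, which is why the Must Have category can survive large estimation errors.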
... Uncertainty is the root of all evil [7]. Therefore, effective planning techniques are needed to deal with uncertainty. ...
... Iterative planning helps manage the uncertainty of requirements change in another way, by having the development team implement higher-priority features in early iterations. This works in our case since our project is time-boxed (we had to finish the development within the time of the academic program) and it also meets the requirements proposed in [7]. Therefore, in our case, functional requirements were grouped into just two categories: "Must-Have" and "Nice-to-Have". ...
Article
Planning is vital to the success of a software project. The only exception might be a trivial "Hello World" project. Although its importance is well known to software developers, there are still things that hold them back when they decide whether to plan or not. Time pressure can be one of these excuses, because a good plan does require a significant investment of time to create and maintain. Another common excuse is uncertainty. Uncertainty comes from the assumptions we have to make in order to continue the project. The completion time of tasks depends on whether those assumptions are true or false; because of them, the completion of tasks is not deterministic, yet the project plan is built upon them. Unfortunately, the probability of all assumptions holding true in a software project is very low [7]. So people argue that a plan will change in the future no matter how good it is, so why bother to create one in the first place? The first part of this argument is true, but not the second. Uncertainty is universal, as the laws of physics tell us. However, planning well with solid techniques can help us manage that uncertainty to the benefit of the software project. In fact, uncertainty is actually a reason that calls for planning.
I. BACKGROUND
This paper is based on a software project done in the Master of Software Engineering program at Carnegie Mellon University, a 16-month program. As part of the program, students are grouped into small development teams of 4-6 members and, under the supervision of two mentors, each team is assigned a real-world software project in which to practice the software engineering techniques learned in class. The project has to be finished successfully within the period of the program. The project under study is called the Bennu project; its client is Bennu Incorporated.
The project goal was to re-architect the client's product X to solve its problems in maintainability and extensibility. The goal was achieved, and the project is currently in the knowledge-transfer stage. The project was successful in terms of meeting its goal; nevertheless, missteps were made during its course. In particular, planning was a constant area for improvement along the way, and many lessons were learned about how to better plan a project. This paper will first present the general planning process that was followed in the project. After that, the problems that arose with planning will be analysed; in particular, the planning difficulties associated with uncertainty, and how iterative planning can help in managing uncertainty, will be discussed based on the project under study.
Preprint
Full-text available
This article analyzes the performance of the MoSCoW method to deliver all features in each of its categories: Must Have, Should Have and Could Have using Monte Carlo simulation. The analysis shows that under MoSCoW rules, a team ought to be able to deliver all Must Have features for underestimations of up to 100% with very high probability. The conclusions reached are important for developers as well as for project sponsors to know how much faith to put on any commitments made. TO CITE THIS PAPER: Miranda, E. (2022). Moscow Rules: A Quantitative Exposé. In: Stray, V., Stol, KJ., Paasivaara, M., Kruchten, P. (eds) Agile Processes in Software Engineering and Extreme Programming. XP 2022. Lecture Notes in Business Information Processing, vol 445. Springer, Cham. https://doi.org/10.1007/978-3-031-08169-9_2
... and their usefulness has become established in many projects at Ericsson, the author's former employer. The techniques in question include the Paired Comparison estimation method (Miranda 2001a), the Statistically Planned Incremental Development method (Miranda 2002), the Rate of Growth Monitoring method (Miranda 1998), the Line-of-Balance (LOB) indicator for tracking progress and a project screening method (Miranda 2001b). ...
... Beyond the problems of defining measurement and of optimal, effective and efficient metrics, many studies conducted on industrial case studies propose and describe the application of various improvement programs, providing evidence through evaluation. In addition to the standard proposed improvement programs, applications of concrete improvement proposals based on different models and methods are also analyzed, such as the use of database mining methods and models [37] and [38], an improved approach to early risk detection using fuzzy logic principles [39], the combination of the critical path method and the incremental software development model in [40], etc. ...
... Optionally a 'won't have' set can be used to store the rest of the story backlog. A similar approach is taken in [26], where it is proposed to plan time-bound projects in several increments with decreasing completion probabilities. However, in our model, the story sets do not correspond directly to iterations as the scope of an iteration is decided in iteration planning. ...
Article
Context: Extreme Programming (XP) is one of the most popular agile software development methodologies. XP is defined as a consistent set of values and practices designed to work well together, but lacks practices for project management and especially for supporting the customer role. The customer representative is constantly under pressure and may experience difficulties in foreseeing the adequacy of a release plan. Objective: To assist release planning in XP by structuring the planning problem and providing an optimization model that suggests a suitable release plan. Method: We develop an optimization model that generates a release plan taking into account story size, business value, possible precedence relations, themes, and uncertainty in velocity prediction. The running-time feasibility is established through computational tests. In addition, we provide a practical heuristic approach to velocity estimation. Results: Computational tests show that problems with up to six themes and 50 stories can be solved exactly. An example provides insight into uncertainties affecting velocity, and indicates that the model can be applied in practice. Conclusion: An optimization model can be used in practice to enable the customer representative to take more informed decisions faster. This can help adopting XP in projects where plan-driven approaches have traditionally been used.
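The paper above solves release planning with an exact optimization model. As a rough illustration of the problem structure it describes (story sizes, business values, precedence relations, and a velocity-derived capacity), a brute-force search over a handful of invented stories might look like this; it is a sketch of the problem, not the authors' model, which additionally handles themes and uncertainty in velocity.

```python
from itertools import combinations

def plan_release(stories, capacity):
    """Brute-force search for the value-maximizing story set that fits
    within `capacity` (the predicted velocity) and respects precedence.
    stories: {name: (size, value, prerequisites)}."""
    best, best_value = (), 0
    names = list(stories)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            chosen = set(combo)
            if sum(stories[s][0] for s in combo) > capacity:
                continue  # over capacity
            if any(not set(stories[s][2]) <= chosen for s in combo):
                continue  # some prerequisite is not in the plan
            value = sum(stories[s][1] for s in combo)
            if value > best_value:
                best, best_value = combo, value
    return set(best), best_value

# Hypothetical stories: (size in points, business value, prerequisites).
stories = {
    "A": (5, 8, []),
    "B": (3, 5, ["A"]),
    "C": (4, 7, []),
    "D": (2, 6, ["C"]),
}
sel, val = plan_release(stories, capacity=11)
print(sorted(sel), val)  # → ['A', 'C', 'D'] 21
```

Enumeration is exponential in the number of stories, which is why the paper formulates the problem as an optimization model; its computational tests show exact solutions remain feasible up to about 50 stories and six themes.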
Chapter
The Thirukkural by Thiruvalluvar contains couplets on the values an individual should uphold in the various roles and circumstances of life; they are applied in many fields, including management, even today. In this chapter, the authors conduct a narrative analysis of two major aspects of the management skills to be inculcated in managers for the successful progression of an organization. Execution is one such important aspect of management: it plays a significant role in constructing effective, doable strategies and in executing those strategies without delay after proper analysis, thus sustaining the team's motivation and progress.
Article
Time-to-market has become more and more critical for software development projects. Thus, the time available for requirements engineering is drastically reduced, and at the same time the requirements are subject to frequent changes due to fast and fierce competition. Whoever releases a product containing the most valuable features first will profit the most. This paper reviews the current state of the art in requirements engineering for dynamic markets, covering traditional as well as the so-called agile methodologies. Methodologies that promise cycle-time reduction, as well as approaches that address changing requirements, are reviewed. This review will be the basis for further research focusing on two main aspects. The first is the integration of change management methodologies into rapid application development processes or, more generally, tying speed and proper change management together. The second concerns agile methodologies and their applicability to different kinds of projects. The possibility of developing a framework for facilitating the choice of a suitable (agile) methodology, or the tailoring of an existing one, will be investigated.
Article
Full-text available
In the framework of the APEX (Airborne Prism Experiment) pushbroom imaging spectrometer, a complete processing and archiving facility (PAF) is developed. The PAF not only includes imaging spectrometer data processing up to physical units, but also geometric and atmospheric correction for each scene, as well as calibration data input. The PAF software includes an Internet-based web server and provides interfaces to data users as well as instrument operators and programmers. The software design, the tools, and the software life cycle are discussed as well. Further, we will discuss particular instrument requirements (resampling, bad pixel treatment, etc.) in view of the operation of the PAF, as well as their consequences for product quality. Finally, we will discuss a combined approach for geometric and atmospheric correction, including BRDF (or view angle) related effects.
Conference Paper
When determining the functionality to complete in upcoming software releases, decisions are typically based upon uncertain information. Both the business value and cost to develop chosen functionality are highly susceptible to uncertainty. This paper proposes a relatively simple statistical methodology that allows for uncertainty in both business value and cost. In so doing it provides key stakeholders the ability to determine the probability of completing a release on time and to budget. The technique is lightweight in nature and consistent with existing agile planning practices. A case study is provided to demonstrate how the method may be used.
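The flavour of such a lightweight statistical method can be sketched as follows: treat each story's cost as a random variable and estimate, by simulation, the probability that the release fits the budget. The uniform cost distributions and figures below are assumptions for illustration only, not the paper's method.

```python
import random

def release_confidence(story_costs, budget, trials=20_000):
    """Monte Carlo estimate of the probability that the release fits the
    budget when each story's cost is uncertain, modelled here as uniform
    between a low and a high bound."""
    random.seed(7)
    ok = sum(
        sum(random.uniform(lo, hi) for lo, hi in story_costs) <= budget
        for _ in range(trials)
    )
    return ok / trials

# Hypothetical (low, high) cost bounds per story, in developer-days.
costs = [(2, 6), (3, 8), (1, 4), (4, 10)]
print(round(release_confidence(costs, budget=22), 3))
```

The same machinery extends to business value: sampling value alongside cost lets stakeholders see the trade-off between confidence in the date and the value expected to ship.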
Article
Multi-project management is crucial in Software Engineering as it draws the resources from common pools, affects the completion date of other projects, determines the priority of use of resources among various projects, involves the judgment of multi-tasking of a common resource, and eventually, determines the success or failure of the projects. Hence, this paper argues that a formal simulation model using System Dynamics principles should be built to study the dynamics of software multi-project management. However, System Dynamics modelling by itself lacks the capability to construct the multi-project network, and thus confines the use of simulation in a single project environment. Thus, this paper is proposing an integration of the System Dynamics model with a multi-project network constructing method, called Critical Chain Project Management (CCPM). CCPM, not only constructs the network, but also recognizes the interdependencies of the multiple projects. However, the combination of these two principles does not simulate unexpected situations, change of policies and strategies that may be encountered during the project development. Hence, a Scenario model is proposed to be integrated with the System Dynamics and CCPM. With such integration, the project manager can identify the restraining factors in various possible scenarios in the multi-project environment, and provide feasible solutions to the senior management.
Conference Paper
Full-text available
This article describes the use of reliability growth models for planning resources, monitoring progress and performing risk analysis of testing activities.
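As an illustration of how a reliability growth model supports test planning, the sketch below uses the Goel-Okumoto NHPP mean value function, one common growth model; the article does not specify which model it uses, and the parameter values here are invented.

```python
import math

def expected_defects(a, b, t):
    """Goel-Okumoto NHPP mean value function: expected cumulative defects
    found by test time t, where a is the estimated total defect content
    and b the per-defect detection rate."""
    return a * (1 - math.exp(-b * t))

# Illustrative parameters; in practice they are fitted to early test data.
a, b = 120.0, 0.05            # 120 latent defects, detection rate 0.05/week
found_by_week_10 = expected_defects(a, b, 10)
remaining = a - found_by_week_10
print(round(found_by_week_10, 1), round(remaining, 1))  # → 47.2 72.8
```

Planners use the remaining-defect estimate to size the test effort still needed, and monitor progress by comparing actual discoveries against the fitted curve.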
Article
Full-text available
This research effort, sponsored by the Program Executive Office for Air ASW, Assault, and Special Mission Programs (PEO(A)), is known as the Navy PEO(A) Technical Performance Measurement (TPM) System. A retrospective analysis was conducted on the T45TS Cockpit-21 program and real-time test implementations are being conducted on the Federal Aviation Administration's (FAA) Wide Area Augmentation System (WAAS) program, the Navy's H-1 helicopter upgrade program, and is currently under consideration for other test implementations across the Department of Defense (DoD) and in private industry. Currently-reported earned value data contains invaluable planning and budget information with proven techniques for program management, however, shortcomings of the system are its emphasis on retrospection and lack of integration with technical achievement. The TPM approach, using the techniques of risk analysis and probability, offers a promising method to incorporate technical assessments resulting systematically from technical parameter measurements to derive more discrete management data sufficiently early to allow for cost avoidance. Results obtained from TPM pilot programs, particularly the Cockpit-21 program, support this premise. Several preliminary issues of interest and conclusions are delineated in this paper that demonstrate that the TPM methodology is a powerful integrated diagnostic tool in support of the new paradigm advocating a multidisciplinary approach to program management. It also promises to provide a powerful new tool in proactive risk management.
Article
Book
From the Book: Corporate and commercial software development teams all want solutions for one important problem--how to get their high-pressure development schedules under control. In Rapid Development, author Steve McConnell addresses that concern head-on with overall strategies, specific best practices, and valuable tips that help shrink and control development schedules and keep projects moving. Inside, you'll find: the best rapid-development strategies that can be applied to any project; candid discussions of great and not-so-great rapid-development practices--estimation, prototyping, forced overtime, motivation, teamwork, rapid-development languages, risk management, and many others; a list of classic mistakes to avoid for rapid-development projects, including creeping requirements, shortchanged quality, and silver-bullet syndrome; and case studies that vividly illustrate what can go wrong, what can go right, and how to tell in which direction your project is headed. Rapid Development is the real-world guide to more efficient applications development.
Article
484 pp., figures, 1 floppy disk included. The methodology used in technological forecasting is made crystal clear in this essential reference for engineers, managers, government strategic planners, and students. Providing a careful balance of theory and practical applications, the book shows technological forecasting at work in business, industry, and government, with well-chosen examples that illustrate both the strengths and limitations of each technique. Demonstrating the key point that technological change can be anticipated, forecast, and managed effectively, the book contains such special features as: the rationale behind each important method in current use, and why it works; vital applications in business, government, and R&D planning; highly useful data tables that provide historical examples and case studies; complete listings of applicable computer programs, with 3.5" and 5.25" computer disks included; new sections on probabilistic methods, never before available in book form; numerous historical examples and hands-on exercises; and much more. For practicing engineers, the book offers the nuts and bolts of preparing technological forecasts, including their mathematical derivations; for managers, it offers a full understanding of how the forecasts can best be applied and what problems may arise. For all readers, it is an indispensable decision-making tool that will greatly enhance their on-the-job effectiveness.
Article
From the Publisher: This book helps you accurately measure the completion time frames for small-to-medium software development projects, with practical techniques for performing software estimates, productivity measurements and quality forecasts. It forms a common underlying methodology, helping you plan the project, create a budget, and set schedules and quality standards. Throughout, the handbook answers the management questions you've always been asking yourself about software projects, including: How long is it going to take? ... How much will it cost? ... How many people will I need? ... What is my risk on meeting the budget? ... What is my risk on meeting the schedule? Appropriate for software engineers, developers, and managers.
Article
Several algorithmic models have been proposed to estimate software costs and other management parameters. Early prediction of completion time is absolutely essential for proper advance planning and aversion of the possible ruin of a project. L.H. Putnam's (1978) SLIM (Software LIfecycle Management) model offers a fairly reliable method that is used extensively to predict project completion times and manpower requirements as the project evolves. However, the nature of the Norden/Rayleigh curve used by Putnam renders it unreliable during the initial phases of the project, especially in projects involving a fast manpower buildup, as is the case with most software projects. In this paper, we propose the use of a model that improves early prediction considerably over the Putnam model. An analytic proof of the model's improved performance is also demonstrated on simulated data
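The Norden/Rayleigh staffing curve underlying Putnam's SLIM model can be written down directly; the sketch below evaluates it at the peak-staffing time, with effort and schedule parameters that are illustrative only.

```python
import math

def rayleigh_staffing(K, t_d, t):
    """Norden/Rayleigh manpower curve used in Putnam's SLIM model:
    m(t) = (K / t_d**2) * t * exp(-t**2 / (2 * t_d**2)),
    where K is the total project effort and t_d the time of peak staffing."""
    return (K / t_d ** 2) * t * math.exp(-t ** 2 / (2 * t_d ** 2))

# Illustrative parameters: 100 person-months total, staffing peaks at month 4.
K, t_d = 100.0, 4.0
peak = rayleigh_staffing(K, t_d, t_d)
print(round(peak, 2))  # → 15.16
```

The curve's slow initial ramp-up is exactly the property the paper criticizes: projects with fast manpower build-up depart from it early on, which degrades predictions made in the initial phases.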
"The Use of Reliability Growth Models in Project Management," Proc. 9th Int'l Symp. Software Reliability
  • E. Miranda