
Abstract

Postmortem analysis (PMA) is a practical method for initiating knowledge management by capturing experience and improvement suggestions from completed projects. It requires little effort and quickly provides initial results, making it suitable even for small- and medium-size projects and companies. The authors describe their experiences with applying PMA techniques for collecting and analyzing experience in software organizations.
Copyright © 2002 IEEE. Reprinted from IEEE Software, May/June 2002. This material is posted here with permission of the IEEE. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.
Postmortem: Never Leave a Project without It

Andreas Birk, sd&m
Torgeir Dingsøyr, Sintef Telecom and Informatics
Tor Stålhane, Norwegian University of Science and Technology

Although primarily used for large projects and companies, postmortem analysis also offers a quick and simple way to initiate knowledge management in small- or medium-size software projects.

In every software project, the team members gain new knowledge and experience that can benefit future projects and each member's own professional development. Unfortunately, much of this knowledge remains unnoticed and is never shared between individuals or teams. Our experience with project postmortem analysis proves that it is an excellent method for knowledge management,1 which captures experience and improvement suggestions from completed projects and works even in small- and medium-size companies that cannot afford extensive KM investments. However, PMA has been mainly advocated for situations such as completion of large projects, learning from success, or recovering from failure.2-4

When used appropriately, PMA ensures that team members recognize and remember what they learned during a project. Individuals share their experiences with the team and communicate them to other project groups. Additionally, PMA identifies improvement opportunities and provides a means to initiate sustained change.

We have applied a lightweight approach to PMA in several projects5,6 by focusing on a few vital principles:

- PMA should be open for participation from the entire team and other project stakeholders.
- Goals can, but need not, provide a focus for analysis.
- The PMA process comprises three phases: preparation, data collection, and analysis. For each phase, team members can apply a number of fairly simple methods, such as the KJ method (after Japanese ethnologist Jiro Kawakita),7 which collects and structures the data from a group of people.

Preparation

When we conduct PMA in software companies, two software process improvement group members work as facilitators, together with anywhere from two project team members to the entire team. Facilitators organize the analysis, steer the discussion, and document the results. They can be employees of the company where the PMA is conducted or external, as we are. External facilitators often have an advantage in performing the PMA because participants regard them as more neutral and objective. However, they might not know the company as well as internal facilitators do, so preparation is important.

During the preparation phase, we walk through the project history to better understand what has happened. We review all available documents, such as the work breakdown structure, project plans, review reports, and project reports.

We also determine a goal for the PMA. Goals might be "Identify major project achievements and further improvement opportunities" or "Develop recommendations for better schedule adherence." If a PMA does not have a specific focus to guide our preparation, we briefly discuss the project with the project manager and key engineers.

We find it practical to distinguish between two PMA types: One is a general PMA that collects all available experience from an activity. The other is a focused PMA for understanding and improving a project's specific activity, such as cost estimation. It helps to explicitly state goals for both of these PMA variants during this phase.
Data collection

In the data collection phase, we gather the relevant project experience. Usually, project team members and stakeholders have a group discussion, or experience-gathering session. We can often conduct data collection and the subsequent analysis within the same session. You shouldn't limit experience gathering to the project's negative aspects, such as things to avoid in the future. Instead, maintain a balance by identifying a project's successful aspects, such as recommended practices. For example, during a PMA at a medical software company, the team realized that the new incremental software integration process significantly improved process control and product quality. Integration had been so smooth that without the PMA, its important role might have gone unnoticed.

Some techniques that we find useful for data collection include

- Semistructured interviews. The facilitator prepares a list of questions, such as "What characterizes the work packages that you estimated correctly?" and "Why did we get so many changes to the work in package X?"
- Facilitated group discussions. The facilitator leads and focuses the discussion while documenting the main results on a whiteboard.
- KJ sessions. The participants write down up to four positive and negative project experiences on post-it notes. Then they present their issues and put the notes on a whiteboard. The participants rearrange all notes into groups according to topic and discuss them.
Once the group identifies the important
topics, we must prioritize them before pro-
ceeding with the analysis. This will ensure that
we address the most significant issues first.
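The grouping and prioritization steps can be sketched in a few lines of code. This is an illustrative assumption, not tooling from the article: the note structure, the function names, and the simple vote count are all hypothetical choices for modeling a KJ session's output.

```python
from collections import defaultdict

def group_notes(notes):
    """Group (topic, text, is_positive) notes by topic, as on the whiteboard."""
    groups = defaultdict(list)
    for topic, text, is_positive in notes:
        groups[topic].append((text, is_positive))
    return dict(groups)

def prioritize(groups, votes):
    """Order topics by participant votes, highest first."""
    return sorted(groups, key=lambda topic: votes.get(topic, 0), reverse=True)

# Example notes from a hypothetical session (positive and negative mixed).
notes = [
    ("requirements", "late requirement changes forced rewrites", False),
    ("integration", "incremental integration improved quality", True),
    ("requirements", "hard to tell when requirements had changed", False),
]
groups = group_notes(notes)
order = prioritize(groups, votes={"requirements": 5, "integration": 2})
print(order)  # → ['requirements', 'integration']
```

Whatever the representation, the point is the same: make the topic list explicit and agree on an order before the analysis starts.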
For example, during a PMA we performed in a satellite software company, frequent and late requirements changes emerged as an important topic. A software developer commented that during the project, team members found it difficult to identify when the requirements had changed, so much so that the code had to be rewritten completely. In such situations, they made a few wrong decisions, which reduced the software's quality. After this PMA session, other project members made requirements changes a high-priority topic for analysis.
Analysis

In this phase, as facilitators, we conduct a feedback session in which we ask the PMA participants: Have we understood what you told us, and do we have all the relevant facts?

When we know that we have sufficient and reliable data, we use Ishikawa diagrams6 in a collaborative process to find the causes for positive and negative experiences. We draw an arrow on a whiteboard, which we label with an experience. Then, we add arrows with causes, which creates a diagram that looks like a fishbone. In our example from the satellite software company, we found four causes for changing requirements: poor customer requirements specification, new requirements emerging during the project, little contact between the customer and software company, and the software company's poor management of requirements documents.

Because PMA participants are a project's real experts and we have time limitations, we perform all analysis in this step.
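The whiteboard structure of an Ishikawa analysis maps naturally onto a spine-plus-causes data structure. The following sketch is an assumption for illustration (the authors used a physical whiteboard, not software); it renders the satellite-company example from the text as indented plain text.

```python
def render_fishbone(experience, causes):
    """Render a fishbone diagram as text: the experience is the spine,
    and each cause is an arrow attached to it."""
    lines = [f"Experience: {experience}"]
    lines += [f"  <- {cause}" for cause in causes]
    return "\n".join(lines)

# The four causes found for the satellite software company's
# "changing requirements" experience.
causes = [
    "poor customer requirements specification",
    "new requirements emerging during the project",
    "little contact between customer and software company",
    "poor management of requirements documents",
]
diagram = render_fishbone("frequent and late requirements changes", causes)
print(diagram)
```

Even this flat form captures what the report needs: one experience per diagram, with every cause attached to it explicitly.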
Results and experience

Facilitators document the PMA results in a project experience report. The report contains

- A project description, including products developed, development methods used, and time and effort needed
- The project's main problems, with descriptions and Ishikawa diagrams to show causes
- The project's main successes, with descriptions and Ishikawa diagrams
- A PMA meeting transcript as an appendix, to let readers see how the team discussed problems and successes
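The report structure above can be sketched as a small template function. This is a hedged illustration only; the section names follow the article, but the function, its signature, and the example data are hypothetical.

```python
def experience_report(description, problems, successes, transcript):
    """Assemble the four report sections as one plain-text document:
    description, problems with causes, successes with causes, transcript."""
    parts = ["Project description", description, "Main problems"]
    for name, causes in problems:
        parts.append(f"- {name} (causes: {', '.join(causes)})")
    parts.append("Main successes")
    for name, causes in successes:
        parts.append(f"- {name} (causes: {', '.join(causes)})")
    parts += ["Appendix: meeting transcript", transcript]
    return "\n".join(parts)

# Hypothetical example data loosely based on the satellite-company PMA.
report = experience_report(
    "Satellite control software; incremental development process.",
    problems=[("frequent and late requirements changes",
               ["poor requirements specification", "little customer contact"])],
    successes=[("incremental integration",
                ["early and continuous integration of components"])],
    transcript="(attach meeting notes here)",
)
print(report)
```

A fixed skeleton like this keeps reports comparable across projects, which matters once the knowledge management or quality department starts following them up.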
In an example from the satellite software company, facilitators wrote a 15-page report in which they documented the problem with changing requirements with an Ishikawa diagram that showed the four main causes. After facilitators submit a report, the knowledge management or quality department must follow up.

In our experience, PMA is suitable when a project reaches a milestone and when the company is looking for qualitative experience that will help improve a similar, future project. You should not apply PMA in situations with unfinished activities, or when serious underlying conflicts might remove the focus from improvement. If the atmosphere isn't appropriate for discussing a project's problems, we prefer using approaches other than PMA, such as those outlined in Project Retrospectives: A Handbook for Team Reviews.2 When there have been serious conflicts in the project, such an approach is more appropriate for managing the risk that discussions degenerate into a hunt for scapegoats. Finally, you must have enough time for following up on PMA results.
In our experience, if teams apply PMA in the right setting, it is an excellent step into continuous knowledge management and improvement activities. It makes project team members share and understand one another's perspectives, integrates individual and team learning, and illuminates hidden conflicts. It documents good practice and problems, and finally, it increases job satisfaction by giving people feedback about their work.
Performing a PMA can even improve project cost estimation. We applied PMA to three projects in an Internet software development company, all of which had serious cost overruns. The company could not allocate workers with skills specific to the project. This led to a need for courses: the team's experts had to act as tutors for the rest of the team and were distracted from their roles in the project. By performing the PMA, the company realized the gravity of the qualification issue and how it led to the project going over budget. As an improvement action, a training budget was set up at the company level instead of the project level. The company no longer charged staff qualification to the project's budget, and now views it as an investment in quality and competitive advantage. As a result of this PMA, management realized the strategic importance of staff qualification and knowledge management, a truth that often gets buried in the hectic rush of the Internet software business.
We received a lot of positive feedback from PMA participants in different companies. In particular, they like that PMA offers a simple yet effective way to uncover both achievements and improvement opportunities. One developer at the satellite software company noted, "If you do a PMA on the project...you have to think through things," which is a crucial part of knowledge management. So, never leave a project without it!
References

1. C. Collison and G. Parcell, Learning to Fly: Practical Lessons from One of the World's Leading Knowledge Companies, Capstone, New York, 2001.
2. B. Collier, T. DeMarco, and P. Fearey, "A Defined Process for Project Post Mortem Review," IEEE Software, vol. 13, no. 4, July/Aug. 1996, pp. 65–72.
3. N.L. Kerth, Project Retrospectives: A Handbook for Team Reviews, Dorset House Publishing, New York, 2001.
4. A.J. Nolan, "Learning from Success," IEEE Software, vol. 16, no. 1, Jan./Feb. 1999, pp. 97–105.
5. T. Stålhane et al., "Post Mortem – An Assessment of Two Approaches," Proc. European Software Process Improvement (EuroSPI 01), ICSN, Bray, Ireland.
6. T. Dingsøyr, N.B. Moe, and Ø. Nytrø, "Augmenting Experience Reports with Lightweight Postmortem Reviews," Proc. 3rd Int'l Conf. Product Focused Software Process Improvement (Profes 01), Lecture Notes in Computer Science, vol. 2188, Springer-Verlag, Berlin, pp. 167–181.
7. D. Straker, A Toolbook for Quality Improvement and Problem Solving, Prentice Hall International, London, 1995, pp. 89–98 and 117–124.
About the Authors

Andreas Birk is a consultant and software engineering professional at sd&m, software design and management. His special interests include software engineering methods, knowledge management, and software process improvement. He holds a Dr.-Ing. in software engineering and a Dipl.-Inform. in computer science and economics from the University of Kaiserslautern, Germany. He is a member of the IEEE Computer Society, ACM, and German Computer Society. Contact him at sd&m, Industriestraße 5, D-70565 Stuttgart, Germany; andreas.birk@sdm.de.

Torgeir Dingsøyr is a research scientist at the Sintef Telecom and Informatics research foundation in Trondheim, Norway. He wrote his doctoral thesis on knowledge management in medium-sized software consulting companies at the Department of Computer and Information Science, Norwegian University of Science and Technology. Contact him at Sintef Telecom and Informatics, SP Andersens vei 15, NO-7465 Trondheim, Norway; torgeir.dingsoyr@sintef.no.

Tor Stålhane is a full professor of software engineering at the Norwegian University of Science and Technology. He has an MSc in electronics and a PhD in applied statistics from the Norwegian University of Science and Technology. He has worked on compiler development and maintenance, software reliability, software process improvement, and systems safety. Contact him at the Department of Computer and Information Science, Norwegian University of Science and Technology, NO-7491 Trondheim, Norway; tor.stalhane@idi.ntnu.no.
... Therefore, it is worth studying phenomena related to process quality, in order to understand and improve product quality. Project failure is a frequently embraced opportunity for post mortem organizational learning [4]. Failure can often be attributed to process anti-patterns, of which the so-called "Fire Drill" is a prominent example, due to its clearly discernible symptoms [10,74]. ...
... To the best of our knowledge, no one has previously attempted to operationalize an anti-pattern using the approach presented in this study. It was previously shown that performing post mortems is a viable path to organizational learning [4] and that learning from anti-patterns is deemed a way to eventually master management knowledge [78]. Also, it appears that examining and learning from eventuated anti-patterns is not limited to the context of software development. ...
... Its description would affect a project rather globally. However, some of SC [1][2][3][4][5][6], as well as ESC [1][2][3] convey the portrayed problematic of SC7 in parts, so that findings were assigned to these instead. During the evaluation of the raters' notes, many instances emerged that could have been assigned to the original symptoms and consequences SC [1][2][3][4][5][6][7]. ...
Article
Background: Nowadays, expensive, error-prone, expert-based evaluations are needed to identify and assess software process anti-patterns. Process artifacts cannot be automatically used to quantitatively analyze and train prediction models without exact ground truth. Aim: Develop a replicable methodology for organizational learning from process (anti-)patterns, demonstrating the mining of reliable ground truth and exploitation of process artifacts. Method: We conduct an embedded case study to find manifestations of the Fire Drill anti-pattern in n=15 projects. To ensure quality, three human experts agree. Their evaluation and the process’ artifacts are utilized to establish a quantitative understanding and train a prediction model. Results: Qualitative review shows many project issues. (i) Expert assessments consistently provide credible ground truth. (ii) Fire Drill phenomenological descriptions match project activity time (for example, development). (iii) Regression models trained on approx. 12–25 examples are sufficiently stable. Conclusion: The approach is data source-independent (source code or issue-tracking). It allows leveraging process artifacts for establishing additional phenomenon knowledge and training robust predictive models. The results indicate the aptness of the methodology for the identification of the Fire Drill and similar anti-pattern instances modeled using activities. Such identification could be used in post mortem process analysis supporting organizational learning for improving processes.
... We observed failure trends both within and across application domains. In the automotive domain, functional failures were more common, because of a reliance on cutting-edge (and faulty) computer vision components (ID 7,8,10,11,12). In the healthcare domain, the lack of a safe state led to failures, specifically when network connectivity was lost (ID 19,20). ...
... In the healthcare domain, the lack of a safe state led to failures, specifically when network connectivity was lost (ID 19,20). Across domains, using a system outside of its intended specification led to failures in autonomous cars (ID 7,8,10,11,12), consumer health monitors (ID 19,20), and smart home products (ID 16). Another cross-domain cause of failure was insecure remote access and authentication, affecting critical infrastructure (ID 1, 3, 4, 5, 6), connected cars (ID 9, 14), and consumer products (ID 15,18). ...
... Incorrect software evolution led to failures in a smart home product (ID 22) and an aircraft (ID 17). Additionally, we observed that 14 of the failure events were a result of multiple sources of failures (ID 2, 3, 4, 5, 6,7,8,9,10,11,12,15,16,22). This observation indicates opportunities to better exercise accident management techniques (i.g., Swiss cheese model [33]) for IoT development. ...
Preprint
Full-text available
As IoT systems are given more responsibility and autonomy, they offer greater benefits, but also carry greater risks. We believe this trend invigorates an old challenge of software engineering: how to develop high-risk software-intensive systems safely and securely under market pressures? As a first step, we conducted a systematic analysis of recent IoT failures to identify engineering challenges. We collected and analyzed 22 news reports and studied the sources, impacts, and repair strategies of failures in IoT systems. We observed failure trends both within and across application domains. We also observed that failure themes have persisted over time. To alleviate these trends, we outline a research agenda toward a Failure-Aware Software Development Life Cycle for IoT development. We propose an encyclopedia of failures and an empirical basis for system postmortems, complemented by appropriate automated tools.
... On the other hand, there were noticeable differences between the Execution and Managing phases, and a flexible strategy is recommended. Based on his research, Sliger [20] concludes that the PMBOK and agile methods are highly compatible. Sliger compares the PMBOK to Highsmith's [21] Agile Project Management paradigm. ...
... Agile, as described by Cockburn [10], is "mostly a mindset," not a "methodology" or "fixed collection of behaviors." In other words, agility is a strategy, not a goal in and of itself [20]. When searching for methods to increase output, product quality, customer satisfaction, and decrease production costs, the core of agility may be at the center of the answer, as was the case for Lockheed Martin [35]. ...
Thesis
Full-text available
Over the past few decades, the significance of knowledge work has caused companies to shift from a progressive way of dealing with project management to a more cooperative working style. A flexible project management system is essential for project managers in today's increasingly interconnected world, where they must respond quickly and effectively to new threats and opportunities. "Agile" project management practices are well-versed in the necessity of distributing responsibility and initiative in support of transformation to change. In this article, we will compare and contrast Agile Project Management with more conventional methods of project management, focusing on its past practices and current applicability. Considering the increased complexity and unpredictability of projects in the modern economy, Agile Project Management has emerged as a useful tool for both the modern knowledge worker and the project managers responsible for their execution. Aiming to promote the adoption of the agile approach in the business world, this document provides an overview of the framework. INTRODUCTION The idea of the perfect executive is debunked in the February 2007 issue of Harvard Business Review, which instead promotes the "incomplete leader" who is less concerned with "command and control" and more with delegating authority and encouraging individual initiative [4]. Because of the increasing value of knowledge work, businesses have been shifting from a hierarchical structure towards a more collegial one for several decades. The writers of an essay published in September 2005's issue of the Project Management Journal share similar feelings regarding the administration of projects, calling into doubt the "veracity of tight centralized management," "rationalist" discourse, and a "command and control" strategy [21]. The writers argue that local reactions should be made more malleable so that the project system can more easily adapt to new challenges as they arise. 
The "agile" methodology is well-versed in this need to delegate authority and encourage self-starting behavior to better accommodate unforeseen changes. In this article, we'll take a look at the origins of Agile PM and see if it makes sense to adopt its methodology for our projects. The new economy's information workers and project managers are finding that Agile Project Management is an invaluable instrument. According to Zwicker [15], Lockheed Martin discovered the agile approach while searching for a better way to create software products. Agile project management will be contrasted with more conventional methods. New Project Management Theories Complex and uncertain project circumstances are hallmarks of the modern economy, and much attention in recent years has been devoted to explaining or rethinking a hypothesis of project management that can be utilized in this setting. Authors of a study from 2002, Koskela and Howell [21], contend that the theory of project management has become defunct. The Project Management Institute (PMI) PMBOK (Project Management Body of Knowledge) guide was written with the management-as-planning model, the sending model of implementation, and the thermostat model of control as its theoretical foundations [10]. Koskela and Howell [15] express reservations about the theory's viability in the real world, particularly regarding dealing with ambiguity and innovation. We don't propose anything completely new, but we do suggest some out-of-the-ordinary details: 2) a greater emphasis on Flow and Value generation in addition to change; 3) the incorporation of management-as-organizing for planning; the language/action perspective for implementation; and the scientific experimentation model for control. Although Serum is an agile project management strategy, Koskela and Howell [16] show that it has substantial theoretical foundations. These foundations include the
... Since software is the primary outcome of software processes [109] and a strong causality between the quality of the process and the quality of the product exists [46], the quality of the development process should be improved first and foremost. While process quality may also be improved by, e.g., implementing process models or standards, another widely adopted practice is to (iteratively) enhance the process based on the results of post mortem organizational learning (i.e., from past projects) [11]. This is depicted in Figure 1.1. ...
... Organizations frequently embrace failed projects as valuable opportunities for organizational learning [11]. Using anti-patterns for efficient organizational learning from past projects is inhibited by a variety of factors today. ...
Preprint
Full-text available
Real-world software applications must constantly evolve to remain relevant. This evolution occurs when developing new applications or adapting existing ones to meet new requirements, make corrections, or incorporate future functionality. Traditional methods of software quality control involve software quality models and continuous code inspection tools. These measures focus on directly assessing the quality of the software. However, there is a strong correlation and causation between the quality of the development process and the resulting software product. Therefore, improving the development process indirectly improves the software product, too. To achieve this, effective learning from past processes is necessary, often embraced through post mortem organizational learning. While qualitative evaluation of large artifacts is common, smaller quantitative changes captured by application lifecycle management are often overlooked. In addition to software metrics, these smaller changes can reveal complex phenomena related to project culture and management. Leveraging these changes can help detect and address such complex issues. Software evolution was previously measured by the size of changes, but the lack of consensus on a reliable and versatile quantification method prevents its use as a dependable metric. Different size classifications fail to reliably describe the nature of evolution. While application lifecycle management data is rich, identifying which artifacts can model detrimental managerial practices remains uncertain. Approaches such as simulation modeling, discrete events simulation, or Bayesian networks have only limited ability to exploit continuous-time process models of such phenomena. Even worse, the accessibility and mechanistic insight into such gray- or black-box models are typically very low. 
To address these challenges, we suggest leveraging objectively captured digital artifacts from application lifecycle management, combined with qualitative analysis, for efficient organizational learning. A new language-independent metric is proposed to robustly capture the size of changes, significantly improving the accuracy of change nature determination. The classified changes are then used to explore, visualize, and suggest maintenance activities, enabling solid prediction of malpractice presence and -severity, even with limited data. Finally, parts of the automatic quantitative analysis are made accessible, potentially replacing expert-based qualitative analysis in parts.
... Based on this agreed state and the definition of goals, changes can be designed and implemented. Postmortems (Birk et al., 2002) are one possibility to elicit best practices but also issues in the execution of projects, feeding the results into an organizational knowledge repository (Ivarsson and Gorschek, 2012). Even though guidelines for executing postmortems exist (Collier et al., 1996;Dingsøyr, 2005), postmortem reviews are seldom held, some suggest for lack of time (Keegan and Turner, 2001;Glass, 2002), even though their benefits are well reported (Verner and Evanco, 2005). ...
Preprint
Full-text available
The development of large, software-intensive systems is a complex undertaking that we generally tackle by a divide and conquer strategy. Companies thereby face the challenge of coordinating individual aspects of software development, in particular between requirements engineering (RE) and software testing (ST). A lack of REST alignment can not only lead to wasted effort but also to defective software. However, before a company can improve the mechanisms of coordination they need to be understood first. With REST-bench we aim at providing an assessment tool that illustrates the coordination in software development projects and identify concrete improvement opportunities. We have developed REST-bench on the sound fundamentals of a taxonomy on REST alignment methods and validated the method in five case studies. Following the principles of technical action research, we collaborated with five companies, applying REST-bench and iteratively improving the method based on the lessons we learned. We applied REST-bench both in Agile and plan-driven environments, in projects lasting from weeks to years, and staffed as large as 1000 employees. The improvement opportunities we identified and the feedback we received indicate that the assessment was effective and efficient. Furthermore, participants confirmed that their understanding on the coordination between RE and ST improved.
... The point of project postmortems is learning not to repeat past mistakes in future projects. The view that IT project postmortem analysis of failed IT projects is necessary is widely supported (Ahonen and Savolainen, 2010;Birk et al., 2002;Ewusi-Mensah, 2003;Glass, 2001;Nelson, 2005;Reel, 1999;Verner et al., 2005;Williams, 2004). Systematic project retrospectives are recommended by the Project Mangement Institute (PMI, 2021a,b). ...
Article
Full-text available
Information technology (IT) projects often fail. Postmortem analysis is not general practice in IT project management. This is a missed opportunity for IT project management because postmortem analysis is a proven source of practice improvements and preventive actions in other domains. In this paper, the root causes of failure of a major IT project are identified by postmortem analysis, a well-established method for investigating accidents and failure ex post facto to improve practice and performance. The root causes of failure identified are: a) inadequate planning, b) novelty of a technology to the organisation, and c) inappropriate software development method and process. The postmortem offers insights into risks and challenges that IT projects still face today. Significantly, the postmortem analysis shows how a different approach to project planning could have prevented the failure and termination of the project. This paper also demonstrates how systematic IT project postmortem analysis can be conducted based on leading theory of process tracing and causal modelling in combination with the literature on IT project failure. The demonstration of this approach to IT project postmortems is new and original.
... The 4th and 5th day are dedicated to the consolidation, processing, and discussion of the results from the small groups as well as the presentation of the research results. The project is concluded with a so-called post-mortem analysis [26]. ...
Conference Paper
Research is a challenging aspect for students for several reasons: Research is a complex process that is usually addressed in many small-scale steps in a wide variety of courses. However, students often fail to see the big picture. This paper describes a competence-oriented inquiry-based learning approach to improve psychology students' research skills. In a capstone project, students are guided by the instructor through a complete research process in which they define their own research question, decide independently on the research design, and conduct and document the research. They conduct guided interviews on "explor-ative sexual research". Evaluation shows the high gain of competence for students. At some point, students ask for a little more assistance for example by preparing a research report.
Chapter
Research is a challenging aspect for students for several reasons: Research is a complex process that is usually addressed in many small-scale steps in a wide variety of courses. However, students often fail to see the big picture. This paper describes a competence-oriented inquiry-based learning approach to improve psychology students’ research skills. In a capstone project, students are guided by the instructor through a complete research process in which they define their own research question, decide independently on the research design, and conduct and document the research. They conduct guided interviews on “explorative sexual research”. Evaluation shows the high gain of competence for students. At some point, students ask for a little more assistance for example by preparing a research report.
Conference Paper
Many small and medium-sized companies that develop software experience the same problems repeatedly, and have few systems in place to learn from their own mistakes as well as their own successes. Here, we propose a lightweight method to collect experience from completed software projects, and compare the results of this method to more widely applied experience reports. We find that the new method captures more information about core processes related to software development in contrast to experience reports that focus more on management processes.
Conference Paper
Learning from experience is the key to success for all who develop software: both the successes and the failures in software projects can help us improve. Here we discuss two versions of Post Mortem Analysis (PMA) as methods for harvesting experience from completed software projects; these can form part of a larger knowledge management program. The two methods are tailored for use in small and medium-sized companies and are conceptually easy to apply. In addition, they require few resources compared to other methods in the field. We believe the methods are useful for companies that need to document their knowledge and find improvement actions, and as a starting point for systematic knowledge harvesting.
Article
Every organization experiences a range of success. In an attempt to improve business processes, many organizations focus on projects that fail in order to "learn from their mistakes". While the author does not disagree with learning from mistakes, many organizations overlook learning from their successes. The result of any project is a direct consequence of the actions and activities used on the project. Therefore, if you believe that failure is no accident, you must also believe that success is no accident. "Learning from Success" is a study on accelerated process improvement that provides a methodology for uncovering why successful projects are indeed successful.
Article
An abstract is not available.
Article
While perpetual refinement is an established norm in many businesses, the author presents a case for tapping into an organization's expertise and learning from its own successes and best practices. Why are successful projects successful? This study on rapid process improvement provides a methodology for uncovering success factors and implementing them throughout an organization.
Article
Most of us pay lip service to the need for software project post mortems, but the literature offers little guidance on how to conduct them. The authors propose a tentative, standard process for conducting post mortem reviews and describe the activities, roles, and artifacts of the process. The success of the post mortem, or of any learning process, demands a context that makes organizational learning possible. Management must make an honest and sincere commitment to establish this context. This commitment should take the form of a public resolution to implement risk management on subsequent projects and to make all post mortem findings input to that risk management effort. After all, lessons learned the hard way on past projects are, if nothing else, risks for future projects. Participants are empowered when they know that each issue raised during the post mortem process will be added to the risk database and evaluated methodically on each subsequent project.
T. Dingsøyr, N.B. Moe, and Ø. Nytrø, "Augmenting Experience Reports with Lightweight Postmortem Reviews," Proc. 3rd Int'l Conf. Product Focused Software Process Improvement.