Measuring the Effectiveness of a Structured Methodology: A
Comparative Analysis
Kay M. Nelson (knelson@ukans.edu)
The University of Kansas, Division of Accounting and Information Systems

Mehdi Ghods (Mehdi.Ghods@boeing.com)
The Boeing Company

H. James Nelson (jnelson@ukans.edu)
The University of Kansas, Division of Accounting and Information Systems
Abstract
This study evaluates a vendor-supplied structured
methodology that was implemented in a large
manufacturing organization. Thirty projects using the
methodology are measured for efficiency and
effectiveness of software maintenance performance. The
performance of these projects is evaluated using both
objective metrics and subjective measures taken from
stakeholders of the software applications. The
performance results of these thirty systems are then
contrasted with the performance results of thirty-five
applications in the same organization that do not use this
structured methodology, and with one hundred sixteen
applications across eleven other organizations. All of
these applications had been in operation at least six
months when they were studied. The software
applications developed and maintained using the
structured methodology were found to have some
significant cost and quality performance gains over the
applications that do not use this methodology.
1. Introduction
Structured methodologies are not usually viewed as
emerging technologies, yet more and more organizations
are implementing them. Although they have been
around for over twenty-five years, these methodologies
have recently experienced a renaissance [1]. The reasons
cited for developing or purchasing structured
methodologies include cost containment, quality control,
schedule control, and process improvement. As many
organizations choose, or are required, to adopt standards
for their information systems processes such as ISO 9000
or the SEI/CMM, they see structured methodologies as a
means to this end [2]. In this way, these methodologies
are re-emerging as a significant factor in software
maintenance and development.
The term structured methods refers to a philosophy of
software development that emphasizes adherence to
a set of consistent rules or methods throughout a project
[3], [4], [5], [6]. These methods include broad initiatives
such as “Systems Development Lifecycles and Methods”
and “Information Engineering” as well as individual
techniques such as structured programming, data flow
diagramming, data modeling, and object-oriented
modeling. The specific set of rules or methods used can
come from a variety of sources. Organizations often
implement their own methodologies for software
development. Commercially produced methodologies are
available from vendors and consultants.
The primary objectives of structured methodologies can
be summarized as follows [7]:
• Achieve high-quality programs of predictable behavior
• Achieve programs that are easily modifiable
• Simplify the development process
• Control the development process
• Speed up system development
• Lower the cost of system development
What is not often directly addressed in the academic
literature on structured methodology, but is equally
important, is the contribution these methods can make to
software maintenance and process control. Software
maintenance is a critical element of the information
systems function in today's organization. As much as
90% of the software life-cycle effort and cost can be spent
in maintenance [8], [9]. Many of the features of
structured methodologies, such as configuration
management and documentation, are directly tied to
software process control [2].
Structured methodologies support development of
easily maintainable applications through decomposition
of complex problems and constructs into simpler ones,
the use of modeling and diagramming techniques,
achievement of code clarity and readability, and earlier
1060-3425/98 $10.00 (c) 1998 IEEE
error detection [10]. Improved communication within
the maintenance team and with end users, improved
documentation, and repeatable procedures also contribute
to process control. Some methodologies provide for
repositories and libraries of code and modules which
encourage reuse and directly impact the time and money
spent on redundant tasks [11].
2. Why structured methods make a difference
Structured methods can make a difference to the long-term
performance of software systems in many ways.
Using structured methods can affect development and
maintenance team efficiency and effectiveness. The
overall quality and business value of the delivered system
can be improved. User satisfaction with product
attributes such as the format of information, the content
of information, ease of use of the system, timeliness of
information, and accuracy of information, as well as
overall user satisfaction have also been shown to be
impacted by structured methods.
Programmers and analysts are not always organized in
their work habits, and structured methods can increase
their productivity through standardization of methods
and outputs [5]. This can be accomplished on two
dimensions, efficiency and effectiveness. Efficiency is
the rate at which programmers produce programs.
Efficiency can be measured in terms of meeting user
needs or meeting specified deadlines. It can also be
measured as how much of the work is performed
correctly the first time and does not need reworking.
More recently, the amount of reuse by programmers and
designers has come to be considered a measure of efficiency [12].
Effectiveness addresses the quality of the product
produced by software teams. Does the application
capture the needs of the business process? Is it delivered
relatively bug free? Does it produce the right
information at the right time? Is it delivered at a cost
that makes it a value to the organization?
Structured methods can reduce the impact of
differences in programmers' abilities [6]. Structured
methods seek to formalize the instinctive good practices
of experienced programmers in a way that can be taught
to programmers of all experience and ability levels.
Examples of these practices are the breaking down of a
large system into modules, well organized coding, and
complete and accurate documentation [1].
Yet another reason why structured methods impact
developer efficiency and effectiveness is the percentage of
time application support teams actually spend
programming. Yourdon [6] estimates that only 15% of
development time is spent writing code. Structured
methods structure not only the programming process but
also the more time-consuming management processes
involved in application support, such as meetings,
reporting, documenting, inspecting, testing, and
communicating [13].
Structured methods can also impact the quality and
business value of a system. The structured methods
performed at the beginning of application development
are especially critical for quality and business value.
Estimation provides an early analysis of costs to benefits.
Data and process models, enterprise models, and design
inspections can ensure that the system being developed is
the one needed for the business [1]. The role of users in
enterprise modeling and design inspections can result in
increased quality and business value. Code inspections
and other forms of testing can contribute to quality by
ensuring delivery of a minimum-defect product [14].
3. Method X
The structured methodology used in this study, which
we will call Method X, was purchased by a Fortune 100
manufacturing firm from a vendor. The firm has in
excess of ten million dollars invested in this
methodology. The vendor that developed and provided
this methodology is a major information technology and
business consulting firm.
Method X is a methodology that supports all aspects of
the software development and maintenance process.
Heavy emphasis is put on standardization and
documentation within the methodology. This was one of
the key reasons that the manufacturing firm, with over
10,000 information systems people, purchased this
methodology. The goal was to enable personnel to move
from project to project without a long learning curve.
The company also hoped to improve quality and cost
through the use of Method X, but did not set specific
goals in these areas.
Regular updates and support are provided to the
purchasing company for Method X. In addition, onsite
representatives of the vendor firm assist the company in
implementing and fine tuning the methodology. The
researchers have observed a great deal of trust and
cooperation between these parties in the past five years.
Method X is representative of many large scale
structured methodologies commercially available,
although this particular implementation of the product is
especially well supported due to the large size of the
purchasing firm. We do not claim that the results found
in this study are generalizable to other implementations
of Method X or similar products, but suggest that these
results can help other researchers in measuring and
interpreting this phenomenon.
4. Measuring structured methodology
efficiency and effectiveness
This study measures the efficiency and effectiveness of
a structured methodology through measures of the
software product and the software maintenance team
performance [15], [16], [17], [18]. Efficiency is captured
by measuring the frequency of design inspections,
configuration management, technical document
updating, and reuse as the application is being
maintained. This information was gathered directly from
the application maintenance teams. Efficiency is also
captured by measuring the perceived speed of changes
produced by the maintainers. This information was
provided by the IS managers and user managers of the
applications studied.
Effectiveness is represented by the cost per function
point of the application, the number of failures per
month, and the number of design, code, and other defects
per month found in the application. These failure and
defect numbers are adjusted for the size of the system.
Effectiveness is also measured subjectively by the
stakeholders of the application, the IS and user
managers. These stakeholders evaluated how well the
system continued to perform when changes were made to
the business process supported by the application. All
perceptual variables are rated on a seven-point Likert scale.
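The size adjustment mentioned above can be sketched as a simple normalization. The following is a hypothetical illustration only; the function name and the per-1,000-FP basis are assumptions, since the paper does not publish its exact adjustment formula:

```python
def size_adjusted_defect_rate(defects_per_month, function_points, per_fp=1000):
    """Normalize a monthly defect count by system size (defects per 1,000 FP)."""
    return defects_per_month / function_points * per_fp

# Hypothetical systems with the same raw defect count but very different sizes:
small_rate = size_adjusted_defect_rate(4, 200)    # 20.0 defects per 1,000 FP
large_rate = size_adjusted_defect_rate(4, 4000)   #  1.0 defect  per 1,000 FP
```

Without such an adjustment, a large system would look worse than a small one simply because it contains more code in which defects can be found.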
The efficiency and effectiveness measures represent
areas where significant differences were found between
the applications using the structured methodology, and
those that did not. Multiple measures were used to
attempt to represent these characteristics. The measures
that emerged as showing differences in performance are
in no way comprehensive, but do represent solid evidence
for the contribution of the structured method to the
efficiency and effectiveness of the applications in
maintenance.
5. Sample and analytical technique
The basic design of this research is a cross-sectional
field study. The original sample of applications was
drawn from twelve organizations. Two years later, after
one of the organizations implemented Method X, an
additional sample of thirty applications representing
those using Method X were studied. The organizations
in the study were chosen for industry diversity, ease of
data collection, and availability of metrics information,
making this a convenience sample. The software
applications in both samples represented a wide variety
of ages, sizes, programming languages, and hardware
platforms. The ages of the systems varied from six
months to thirty years old. The sizes range from 130
function points to over 10,000 function points. The
programming languages range from mainframe COBOL
to client server C++. The hardware platforms the
software systems are running on include mainframes,
workstations, client/server installations, personal
computers, and radio frequency installations. The
analyses performed were controlled for these factors and
tested for heterogeneity of variance.
TABLE 1
PARTICIPATING ORGANIZATIONS:
INDUSTRIES REPRESENTED AND NUMBER OF SYSTEMS

Industry              Number of Participating Software Systems
Manufacturing         35 plus 30
Manufacturing         5
Oil and Gas           5
Human Services        14
Telecommunications    9
Express Shipping      5
Natural Resources     3
Employment Services   5
Transportation        8
Public Comptroller    18
Manufacturing         5
Electric Utility      4
[Figure 1. Age of Systems in Study: number of systems operating <1, 1-3, >3-5, >5-10, and >10 years.]
[Figure 2. System Language: number of systems written in COBOL, FORTRAN, C, object-oriented languages, and other languages. Systems may be written in more than one language.]
[Figure 3. System Platform: number of systems running on mainframe, workstation, client/server, and PC platforms. Systems may run on more than one platform.]
[Figure 4. System Size: number of systems that are small (<300 FP), medium (300-2000 FP), and large (>2000 FP).]
To test the contribution of Method X, objective data
about methods used, frequency of use, failures, defects,
size of system, and cost were gathered directly from the
application maintenance teams. For each represented
software application, subjective performance information
was collected in a survey given to IS managers and user
managers. The response rate was 100%.
One-way analysis of variance (ANOVA) was used to
detect whether the means in the Method X sample were
significantly different from those in the original sample at
the .05 and .10 significance levels [19].
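The comparison above can be sketched in code. The following is a minimal pure-Python implementation of the one-way ANOVA F statistic; the sample data are invented for illustration, and the original analysis presumably used a standard statistical package:

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    observations = [x for g in groups for x in g]
    grand_mean = sum(observations) / len(observations)
    k, n = len(groups), len(observations)
    means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares, weighted by group size
    ssb = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares (residual variation inside each group)
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    # Mean squares use k-1 and n-k degrees of freedom respectively
    return (ssb / (k - 1)) / (ssw / (n - k))

# Invented cost-per-function-point samples for two sets of projects
f_stat = one_way_anova_f([310, 320, 335, 305], [1050, 1100, 1120, 1115])
```

A large F indicates that the variation between the two samples' means is large relative to the variation within each sample; the significance (p) value is then read from the F distribution with (k-1, n-k) degrees of freedom.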
6. Results
The most startling result of this study is the large and
significant (.10 level) difference in cost per function
point of the applications using Method X. The annual
average cost per function point of maintaining the thirty
projects using the methodology was $318 versus $1096
for the rest of the projects in the manufacturing
organization and $802 for all of the projects in the study
not using Method X. Tables 2 and 3 show the ANOVA
results for cost per function point and other metrics used
in the study.
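To see what these averages imply at the scale of a single application, they can be multiplied out for a hypothetical 1,500 function point system (the scaling below is illustrative only; the paper reports portfolio averages, not per-application figures):

```python
# The paper's reported average annual costs per function point
COST_PER_FP = {
    "method_x": 318,             # Method X projects
    "same_firm_no_method": 1096, # same firm, no Method X
    "other_orgs_no_method": 802, # all non-Method X projects in the study
}

def implied_annual_cost(function_points, cost_per_fp):
    """Scale an average cost per function point up to a whole application."""
    return function_points * cost_per_fp

# Hypothetical 1,500 FP application under each average
costs = {name: implied_annual_cost(1_500, c) for name, c in COST_PER_FP.items()}
# method_x -> 477000, same_firm_no_method -> 1644000, other_orgs_no_method -> 1203000
```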
Table 2
SOFTWARE MAINTENANCE VARIABLES
ANOVA (one way)
All Organizations
n = 116 Non-Methodology, n = 30 Methodology

Variable                                                            F     Sig.
Cost per Function Point                                           2.93    .09
Use of Design Inspections                                        22.61    .00
Configuration Management                                         19.68    .00
Reuse from Other Applications                                     4.15    .04
Ability to Migrate Across Platforms                               8.70    .00
Updating of Technical Reference Material                          3.86    .05
Continues to Perform Well when Change Occurs in Business Process   .07    .75
Speed of Changes Performed by Maintainers                         3.86    .05
Table 3
SOFTWARE MAINTENANCE VARIABLES
ANOVA (one way)
Manufacturing Organization Only
n = 35 Non-Methodology, n = 30 Methodology

Variable                                                            F     Sig.
Cost per Function Point                                           3.81    .06
Use of Design Inspections                                        20.58    .00
Configuration Management                                           .58    .45
Reuse from Other Applications                                     3.69    .06
Ability to Migrate Across Platforms                               5.75    .02
Updating of Technical Reference Material                          8.43    .00
Continues to Perform Well when Change Occurs in Business Process  5.28    .03
Speed of Changes Performed by Maintainers                         2.06    .16
In the case of design inspections and configuration
management, the results are also somewhat surprising.
While no significant configuration management
differences were found within the manufacturing
organization, they were found when compared to the
entire sample. Significant use of design inspections was
found in both comparisons. The interesting finding is
that the Method X application teams did less design
inspections and were not as likely to use configuration
management. On the other hand, these teams had
significantly better levels of reuse from other systems,
technical documentation, and ability to migrate across
platforms.
The results of the perceptual questions given to IS and
user managers show that the Method X applications had
significantly better performance continuation when there
were business process changes. However, the perceived
speed of changes performed by the maintenance staff is
significantly lower for the Method X teams than those in
the entire sample.
In the area of failures and defects, significant
differences were found in the number of design defects
discovered per month between the Method X medium and
large projects and those in the entire sample. No
significant differences were found in small projects, but
the number of code defects per month was significantly
lower in large projects using Method X when compared
with both the parent manufacturing company and the
entire sample. Tables 4 and 5 show these results.
Table 4
Failure and Defect Variables ANOVA (one way)
Manufacturing Organization Only
n = 35 Non-Methodology, n = 30 Methodology

Variable                                                        F     Sig.
Failures per Month, <300 function points                      1.96    .67
Failures per Month, 300-2000 function points                  2.64    .12
Failures per Month, >2000 function points                     2.49    .13
Code Defects Discovered per Month, <300 function points        .58    .47
Code Defects Discovered per Month, 300-2000 function points   1.85    .19
Code Defects Discovered per Month, >2000 function points      6.25    .02
Design Defects Discovered per Month, <300 function points     2.78    .14
Design Defects Discovered per Month, 300-2000 function points 3.00    .10
Design Defects Discovered per Month, >2000 function points    5.54    .03
Other Defects Discovered per Month, <300 function points       .19    .68
Other Defects Discovered per Month, 300-2000 function points  2.48    .13
Other Defects Discovered per Month, >2000 function points     2.78    .12
Table 5
FAILURE AND DEFECT VARIABLES
ANOVA (one way)
All Organizations
n = 116 Non-Methodology, n = 30 Methodology

Variable                                                        F     Sig.
Failures per Month, <300 function points                       .64    .44
Failures per Month, 300-2000 function points                   .13    .72
Failures per Month, >2000 function points                     1.97    .17
Code Defects Discovered per Month, <300 function points        .31    .59
Code Defects Discovered per Month, 300-2000 function points    .74    .39
Code Defects Discovered per Month, >2000 function points      4.92    .03
Design Defects Discovered per Month, <300 function points      .06    .82
Design Defects Discovered per Month, 300-2000 function points 1.17    .28
Design Defects Discovered per Month, >2000 function points    2.41    .13
Other Defects Discovered per Month, <300 function points       .53    .48
Other Defects Discovered per Month, 300-2000 function points  1.96    .17
Other Defects Discovered per Month, >2000 function points     2.61    .11
7. Discussion
Structured methodology appears to have a significant
cost impact in the case of Method X. The analyses
performed in this research were controlled for application
characteristics such as size, platform, and age. What is
not controlled for, however, are organizational
initiatives such as downsizing or reengineering. During
the two years between the original study and the
implementation and measurement of the Method X
projects, the company studied did reorganize its IS group.
At the same time, many maintainers left the company
due to the increased competition for their skills or early
retirement. However, the lower cost was also evident for
the Method X projects when compared to the entire
sample including eleven companies other than the
parent. This suggests that the implementation of a
structured methodology did at least contribute to these
cost savings.
The findings of reduced levels of design inspections
and configuration management are at first surprising.
Upon deeper analysis of Method X, however, it was
found that these tasks are built into the methodology in a
way that makes them relatively seamless, so they could easily
be missed as separate initiatives. To fully conclude that
this is the case, these applications will need to be studied
in a future time period to obtain longitudinal
confirmation.
The significantly higher level of reuse from other
applications found in the Method X projects indicates
that the strategy of the company for utilizing a structured
methodology is working. One of the reasons for
implementing Method X was to allow maintainers to
easily move from project to project. Since a great deal of
reuse is happening, applications must be relatively
standardized and transferable, and therefore constructed
in a similar fashion.
The significantly higher level of technical
documentation updates also suggests that it will be easier
for personnel to move from project to project. Method X
has a rigorous set of documentation that must be
maintained. Each of these documents is concise and to
the point, but no changes are allowed to the application
without a change in the documentation. It does not
appear that this rigor is extended to either user reference
or user training material, which showed no significant
differences between Method X projects and all projects.
The higher level of ability to migrate across platforms
is partially an artifact of the increased need for this
feature today versus two years ago. However, higher
levels of reuse and documentation should facilitate this
process.
IS and user managers in the manufacturing company
rated the Method X projects significantly higher in
ability to perform after changes in the business process.
This result was not significant in comparison to the
entire twelve organizations. This finding demonstrates
that in addition to objective measures of performance,
management stakeholders are also perceiving improved
performance in applications supported by this structured
methodology. However, these stakeholders also rated the
Method X teams significantly lower in speed of
performing maintenance changes when compared to the
entire sample. This seems contradictory to the cost
results, but in reality may be supportive. It may take
more initial time to ensure that documentation is being
updated and that reusable components are being
integrated into the application. It is also likely that once
the methodology is followed, individual changes may
appear to take more time, but the number of changes is
dropping due to better process control, thereby lowering
costs. The short-term loss of speed is traded off for a
long-term gain in cost effectiveness.
The use of a structured methodology is having a
significant impact only in medium and large projects in
the areas of code and design defects. This may indicate that
structured methods are more appropriate for this size of
projects, or that a learning curve exists based on the size
of application maintained, and that small projects will
see reduced failures and defects sometime in the future.
8. Summary
This study demonstrates that a structured methodology
can be measured for efficiency and effectiveness
contributions to software maintenance. The largest
impact of the single methodology studied was the
significantly reduced annual cost per function point of
maintaining the applications. This result was tempered
with a perceived loss of speed in making changes to the
applications. Other results showed higher levels of reuse
from other projects, ability to migrate between hardware
platforms, and some reduced code and design defects
found per month in medium and large applications.
Significantly reduced amounts of design inspections and
configuration management were found in the structured
methodology applications, but this may be explained by
the integration of these techniques into the methodology
itself.
Future research suggested by this study could include
the development of more complex models based on the
individual factors found to be significant for the single
methodology in this sample. Multiple methodologies
could be compared using such a model. This type of
research is well suited for longitudinal comparisons to
determine if methodologies become institutionalized and
are consistently implemented and enforced over time.
Utilizing a single methodology within a single
organization, this study has compared performance
results for software applications being maintained both
within and outside the home company. These results
suggest to practitioners which performance
characteristics might be appropriate to measure, and how
they can be used to make more profitable decisions
regarding software maintenance.
References

[1] Topper, Andrew, Ouellete, Daniel, and Jorgensen, Paul, Structured Methods: Merging Models, Techniques, and CASE, McGraw-Hill, New York, 1994.

[2] Sanders, Joc and Curran, Eugene, Software Quality: A Framework for Success in Software Development and Support, Addison-Wesley, 1994.

[3] Tung, Sho-Huan, "A Structured Method for Literate Programming", Structured Programming, Vol. 10, No. 2, 1989, pp. 113-120.

[4] Yourdon, Edward, Techniques of Program Structure and Design, Prentice Hall, Englewood Cliffs, NJ, 1975.

[5] Yourdon, Edward, Managing the Structured Techniques: Strategies for Software Development in the 1990's, Yourdon Press/Prentice Hall, New York, 1989.

[6] Yourdon, Edward, Modern Structured Analysis, Yourdon Press/Prentice Hall, New York, 1989.

[7] Martin, James and McClure, Carma, Structured Techniques: The Basis for CASE, Prentice Hall, Englewood Cliffs, NJ, 1988.

[8] Bennett, K. H., "Automated Support of Software Maintenance", Information & Software Technology, Vol. 33, No. 1, Jan/Feb 1991, pp. 74-85.

[9] Gorla, N., "Techniques for Application Software Maintenance", Information & Software Technology, Vol. 33, No. 1, Jan/Feb 1991, pp. 65-73.

[10] Chapin, Ned, "Some Structured Analysis Techniques", Data Base, Vol. 10, No. 3, Winter 1979, pp. 16-23.

[11] Dolk, Daniel R., "Model Management and Structured Modeling: The Role of Information Resource Dictionary Systems", Communications of the ACM, Vol. 31, No. 6, June 1988, pp. 704-718.

[12] Phan, Dien D., et al., "Managing Software Quality in a Very Large Development", Information & Management, Vol. 29, No. 5, November 1995, pp. 277-283.

[13] DeMarco, Tom, Structured Analysis and System Specification, Yourdon Press, New York, 1978.

[14] Chaar, J.K., Halliday, M.J., Bhandari, I.S., and Chillarege, R., "In-Process Evaluation for Software Inspection and Test", IEEE Transactions on Software Engineering, Vol. 19, No. 11, November 1993, pp. 1055-1070.

[15] Banker, Rajiv D., Datar, Srikant, and Kemerer, Chris, "A Model to Evaluate Variables Impacting the Productivity of Software Maintenance Projects", Management Science, Vol. 37, No. 1, January 1991.

[16] Banker, Rajiv D., Datar, Srikant, and Kemerer, Chris, "Factors Affecting Software Maintenance Productivity: An Exploratory Study", Proceedings of the Eighth ICIS, Pittsburgh, PA, December 6-9, 1987, pp. 160-175.

[17] Jones, Capers, Programming Productivity, McGraw-Hill, New York, 1986.

[18] Swanson, E. Burton and Beath, Cynthia Mathias, "Departmentalization in Software Development and Maintenance", Communications of the ACM, Vol. 33, No. 6, June 1990, pp. 658-667.

[19] Glass, Gene V. and Hopkins, Kenneth D., Statistical Methods in Education and Psychology, Allyn and Bacon, Needham Heights, MA, 1984.