Requirements Fixation
Rahul Mohanani, Paul Ralph, and Ben Shreeve
Lancaster University
Lancaster, UK
rahul.mohanani@gmail.com, paul@paulralph.name, ben.shreeve@gmail.com
ABSTRACT
There is a broad consensus that understanding system desiderata
(requirements) and design creativity are both important for
software engineering success. However, little research has
addressed the relationship between design creativity and the way
requirements are framed or presented. This paper therefore aims
to investigate the possibility that the way desiderata are framed or
presented can affect design creativity. Forty-two participants took
part in a randomized controlled trial where one group received
desiderata framed as “requirements” while the other received
desiderata framed as “ideas”. Participants produced design
concepts which were judged for originality. Participants who
received requirements framing produced significantly less original
designs than participants who received ideas framing (Mann-
Whitney U=116.5, p=0.004). We conclude that framing desiderata
as “requirements” may cause requirements fixation where
designers’ preoccupation with satisfying explicit requirements
inhibits their creativity.
Categories and Subject Descriptors
D.2.1 [Software Engineering]: Requirements/Specifications;
D.2.10 [Software Engineering]: Design
General Terms
Documentation, Design, Experimentation, Human Factors
Keywords
Design Creativity, Requirements, Cognitive Bias, Randomized
Controlled Trial
1. INTRODUCTION
It is widely accepted in the software engineering (SE) research
community that understanding system requirements is critical to
designing good systems (cf. [16, 20, 32, 81, 85, 89, 99, 100]).
While some disagree as to whether requirements should be
understood more upfront (e.g. [38, 44]) or as development
progresses (e.g. [5, 14]) most agree that requirements are
important sooner or later. Practitioners similarly treat
requirements-understanding as crucial to system success,
especially in outsourcing and tendering contracts [11]. The
dangers of getting requirements wrong or failing to account for
requirements changes are widely recognized [15, 41, 64, 103].
Requirements Engineering (RE) research has consequently
investigated techniques for eliciting, analyzing, modeling and
communicating requirements.
How a situation is framed, however, can have powerful effects on
the cognition and performance of human participants (see below).
Some (e.g. [78]) have suggested that framing the context of a
software development project in terms of requirements may
deleteriously affect design creativity. Specifically, misperceiving
all ‘requirements’ as compulsory may interfere with design space
exploration. Yet, little research has empirically investigated the
effects of presenting desiderata as “requirements” on design
creativity. We therefore propose the following research question.
Research Question: Does framing desiderata as
“requirements” negatively affect creativity in design
concept generation?
Here, a desideratum is “something for which a desire or longing is
felt; something wanting and required or desired” [66].
Requirement, meanwhile, has been defined in several ways
including “a statement that identifies a capability or function that
is needed by a system in order to satisfy its customer’s needs” [4]
and “a property that must be exhibited in order to solve some
problem in the real world” [9]. Others emphasize that
requirements state “what a system is supposed to do, as opposed
to how it should do it” [102]. Here, framing therefore refers to
how the desiderata are presented or communicated; e.g., a list of
“the system shall...” requirements [40]; a backlog of user stories
[87]; a set of use case narratives [23]. In other words, when a
desideratum is presented as mandatory, it is being framed as a
requirement. Meanwhile, design concept generation refers to
informally specifying one or more ideas for a software artifact.
Designers often use sketching [69], storyboarding [95] or other
informal modeling techniques to specify design concepts.
We theorize that framing desiderata as “requirements” will lead to
less creative designs. Specifically, we suspect that the high
importance and confidence connoted by the term requirement
shuts down designers’ creative processes by promoting the view
that the problem is well-understood and already largely solved.
The paper therefore proceeds by reviewing existing literature on
fixation and requirements engineering (§2). Next we propose the
concept “requirements fixation” and describe the methodology
(§3) and results (§4) of a laboratory study of this concept. Section
5 discusses the interpretation and implications of the findings and
Section 6 concludes the paper with a summary of its
contributions and suggestions for future research.
Permission to make digital or hard copies of all or part of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for profit or commercial advantage and that copies bear this notice and the full citation
on the first page. Copyrights for components of this work owned by others than ACM
must be honored. Abstracting with credit is permitted. To copy otherwise, or republish,
to post on servers or to redistribute to lists, requires prior specific permission and/or a
fee. Request permissions from Permissions@acm.org.
Copyright is held by the author/owner(s). Publication rights licensed to ACM.
ICSE’14, May 31 – June 7, 2014, Hyderabad, India
ACM 978-1-4503-2756-5/14/05
http://dx.doi.org/10.1145/2568225.2568235
2. LITERATURE REVIEW
2.1 Fixation
Cognitive biases are systematic deviations from optimal reasoning
[80, 91] or psychological phenomena that “prejudice decision
quality in a significant number of decisions for a significant
number of people” [3]. Many cognitive biases have been
discovered – Arnott alone reviews 37 [3]. Previous research has
investigated the role of myriad cognitive biases in software
engineering generally (e.g. [67, 80, 90]) and requirements
engineering specifically (e.g. [12]). The current study is concerned
primarily with two biases – fixation and framing effects.
Fixation was originally proposed by Freud in reference to unusual
sexual traits; however, its meaning has since broadened to refer to
the tendency to “disproportionately focus on one aspect of an
event, object, or situation, especially self-imposed or imaginary
obstacles” [80]. People can fixate on myriad objects and
properties; e.g., the color of a car, the presence of a spider, the
placement of a button. Fixating on one aspect of something
usually implies marginalizing other aspects; e.g., fixating on a
software system’s speed while ignoring its aesthetics, or vice
versa.
Jansson and Smith proposed the concept design fixation – “a blind
adherence to a set of ideas or concepts limiting the output of
conceptual design” [42]. In a series of experiments on design
fixation, participants were asked to design a bicycle rack for a car,
a measuring cup for the blind, and a spill-proof coffee cup. In
each case, participants were divided into two groups and the
treatment group was given a flawed example design. Treatment
groups consistently produced more designs that mimicked
negative aspects of the flawed example design. For instance, in
the measuring cup study, the treatment group “generated more
non-infinitely variable designs than the control group, more
designs without overflow devices, and more overall designs
similar to the example” [42].
Numerous replications, extensions and variations on Jansson and
Smith’s methodology have since been published, with myriad
results. For instance, propensity for fixation varies by domain;
e.g., mechanical engineers fixate more than industrial designers
[70-73]. When designers are given good, rather than intentionally
flawed, examples, they still fixate on the example but produce
higher-quality designs than designers without examples [49].
Presenting designers with common ideas (or examples)
produces more fixation than unusual examples [68]. Some
evidence suggests that fixation can be mitigated by “defixating”
instructions [21], i.e., instructions to avoid problematic features of
given examples. Some evidence suggests that physical
prototyping reduces design fixation [101]. Meanwhile in software
engineering, some have suggested that inconsistency in software
specifications may reduce premature commitment [65].
Moreover, one would expect (fixated) treatment groups to produce
fewer designs than control groups. However, no evidence of such
a relationship has been found [42, 49, 70-73]. It is not clear
whether this results from the nature of fixation and creativity or
from the time limits imposed on participants in these studies.
In summary, numerous studies have found that designers fixate on
given examples in laboratory settings. The relationship between
fixation and solution quality is moderated by example quality. The
relationship between fixation and creativity is moderated by the
domain (e.g., mechanical engineering), task (e.g. physical
prototyping) and the framing of the examples (e.g. de-fixating
instructions). This last point highlights the relationship between
fixation and framing.
2.2 Framing Effects
Another cognitive bias, related to fixation, is the framing effect.
The “framing effect is the tendency to give different responses to
problems that have surface dissimilarities but that are really
formally identical” [91]. For example, in one experiment,
participants were asked to choose between two treatments for a
hypothetical disease – Treatment A would save 200 of 600 people;
Treatment B had a 1 in 3 chance of saving everyone and a 2 in
3 chance of saving no one. When participants were asked to
choose between 400 people definitely dying or a 1 in 3 chance
that no one will die, most chose the latter. However, when
participants were asked to choose between definitely saving 200
people or a 1 in 3 chance of saving everyone, most chose the
former [96]. Here, the difference in responses is entirely
attributable to the way the question is presented (or framed) rather
than the underlying structure of the treatments.
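To make the formal equivalence explicit: Treatment A saves exactly 200 of the 600 people, while Treatment B saves (1/3)·600 + (2/3)·0 = 200 people in expectation; the two frames describe the same gamble in different words.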
The framing effect is extremely robust [7, 19, 47, 97]; i.e., it
applies to many people across diverse circumstances. Levin et al.
[47] identified three types of things framed in existing studies –
individual attributes, goal statements and risk profiles (as in the
hypothetical disease, above). Of course, framing effects have been
leveraged in marketing and advertising for many years [46],
which explains why notebook computers are described as “1.7cm
thin”. However, framing effects are somewhat mitigated by task
involvement [51, 83].
When researchers (e.g. [21]) investigate fixation by giving
participants different instructions, the independent variable may
be considered task framing and the results (fixation) may be
considered a kind of framing effect. Of course, fixation is not
inherently a framing effect – people can become fixated without
intervention. However, most existing studies of fixation leverage
framing effects. Consequently, the primary conclusion of design
fixation research may be reconceptualized from designers fixate
on given examples to task framing causes fixation.
In software engineering, however, developers are more often asked
to design a system based on some sort of requirements
specification. This raises the question, can the framing of a
requirements specification lead to fixation and reduce creativity?
2.3 Requirements, Design and Creativity
Like Software Engineering, Requirements Engineering (RE)
simultaneously refers to a collection of activities and the academic
field that studies those activities. While no single definition of
requirements engineering is widely accepted, the activities in
question include understanding, specifying, documenting,
communicating and modifying problems, needs, wants and other
desiderata [15, 18]. RE may also focus on goals [2, 28, 98], users
[57, 93], agents [10] and non-functional properties [22, 37, 60].
RE is also concerned with analyzing and predicting numerous
properties of desiderata, including stability [13, 40].
RE is often portrayed as relatively independent from designing
and implementing software systems. Many authors (e.g. [4, 102])
emphasize that RE primarily concerns determining what the
software should do rather than how it should do it. The
interdisciplinary design literature, in contrast, emphasizes that
problem framing and solution generation are fundamentally the
same cognitive process [27, 86]. Specifically, empirical studies of
expert designers reveal that designers rapidly oscillate between
their understanding of the context and ideas for design candidates,
simultaneously revising both – a process sometimes called
coevolution [30, 50]. Some research has also explored coevolution
in software engineering [74, 77, 79].
Coevolution is closely related to creativity [36], which is
increasingly recognized as important within RE. Some now argue
that RE is an inherently creative act [52-55]. For example, Maiden
et al. argue that “requirements are the key abstraction that
encapsulates the results of creative thinking about the vision of an
innovative product” [55] and that RE processes may be improved
by integrating creativity techniques [53]. Producing truly
innovative products entails inventing requirements no client or
user may think of [82]. Field research clearly suggests that RE is
creative and opportunistic in practice [62]. Four trends are driving
this need to incorporate creativity into requirements processes – 1)
the strategic importance of creativity for competitive advantage;
2) the increasing diversity of devices and applications; 3) the
increasing acceptance of coevolution in (particularly Agile)
systems development methods; 4) the increasing interest in
creativity among requirements practitioners [54].
Recent research on requirements creativity straddles the gulf
between the two design paradigms [29, 31, 76]. In the rational
paradigm (which dominates engineering), clients have
requirements, analysts elicit those requirements and developers
search for solutions that satisfy requirements [11]. Contrastingly,
in the alternative paradigm (which dominates product and
industrial design) designers are faced with problematic situations
characterized by goal disagreement [17] and few definitive
requirements [78]; designers simultaneously refine their
understandings of the context and solution space, often exploring
the context by generating design concepts [74]. The rational
paradigm traditionally downplays the importance of creativity
since design is presented as heuristic search of a known,
constrained solution space [88]. Meanwhile the alternative
paradigm traditionally emphasizes the importance of creativity for
good design.
Moreover, creativity itself is the focus of much research in myriad
fields including psychology, sociology, management and
education. While a comprehensive review is beyond the scope of
this paper, several points warrant discussion. First, no single
definition of creativity is widely accepted; however, there is broad
consensus that creativity is a cognitive process that produces
novel and useful ideas [59]. Some distinguish creativity from
different perspectives – the individual designer’s (p-
creativity), the specific design context’s (s-creativity) or society’s
(h-creativity) [6, 94]. Creativity depends on cognitive skills, risk
tolerance, domain specific knowledge and many situational
factors [1]. It also relates to social context in that it involves
deviating from social norms and structures [26]. Creativity is
related to but separate from intelligence [61, 92].
Creativity is often linked to divergent thinking [39], i.e., exploring
many possible solutions to a problem rather than deriving a single
correct answer. However, divergent thinking is not equivalent to
creativity as the former entails seeing many possibilities but not
necessarily creating anything novel or useful [84]. Rather,
creativity involves generating many ideas, some better than
others, and then effectively identifying the best.
3. RESEARCH METHODOLOGY
This section describes an exploratory experiment to investigate
the relationship between desiderata framing (the independent
variable) and the originality of design concepts (the dependent
variable). Briefly, participants were given a list of desiderata
framed as either “requirements” or “ideas” and asked to generate
design concepts. We then rated the design concepts for originality.
If one group produced substantially more original designs, it
would suggest that desiderata framing affects creativity.
3.1 Theorizing Requirements Fixation
During a previous study, the second author observed a team
developing a mobile application. The client provided a quite
rudimentary and generally poor design concept with instructions
to essentially ‘build something like this’. Rather than question or
try to improve this simple specification, the developers appeared
to take it for granted and began coding. After an intervention, the
team recognized the design flaws, threw away the specification
and designed the system from scratch. The client subsequently felt
that the new design was a major improvement.
This led us to theorize that, in some cases, software developers are
sensitive to requirements fixation: the tendency to
disproportionately focus on desiderata that are explicitly framed
as requirements. Symptoms of requirements fixation may include:
• failing to question dubious desiderata
• perceiving all desiderata as having equal (high) importance
• perceiving all desiderata as having equal (high) confidence
• failing to consider the relationship between desiderata and
overall goals
• failing to notice conflicting desiderata
• failing to notice desiderata ambiguity
• failing to consider implicit or non-functional desiderata
Like design fixation, requirements fixation is clearly too complex
to evaluate holistically in a single study. We therefore begin by
examining the relationship between the framing of desiderata and
the originality of design concepts using an experimental design
analogous to previous design fixation experiments (§2.1).
3.2 Hypothesis
We hypothesize that design concept originality will be lower when
desiderata are framed as “requirements” than when they are
framed as “ideas”. When desiderata are framed as “requirements”,
we expect participants to perceive the desiderata as complete,
certain or fixed, triggering fixation and reducing creativity. In
contrast, when desiderata are framed as “ideas” we expect
participants to perceive the desiderata as incomplete, uncertain or
flexible. This conceptualization should trigger more creative
thinking, which should lead to more innovative, creative designs.
Consequently, our hypotheses are as follows.
Hypothesis H0: Participants who receive desiderata
framed as “requirements” will produce neither more nor
less creative design concepts than participants who
receive desiderata framed as “ideas”.
Hypothesis H1: Participants who receive desiderata
framed as “requirements” will produce less creative
design concepts than participants who receive desiderata
framed as “ideas”.
3.3 Participants
Participation was solicited from post-graduate students enrolled in
management and engineering programs at the authors’ university
using relevant student mailing lists. A convenience sample of 42
participants was selected – 19 female and 23 male, with a mean
age of 25 years (standard deviation 6.067). All participants had at
least 1 year of professional design experience, with 14 coming
from a software engineering background. None of the participants
had design experience in the particular field of the task, i.e.,
mobile applications. Participants received no financial
compensation but a complimentary lunch was provided.
3.4 Experimental Design
A between-subjects randomized controlled trial was chosen for
this study. Participants were randomly assigned to one of two
equally-sized groups – Group A and Group B. Group A and Group
B completed the study in separate but very similar rooms, each
with a single invigilator. The invigilator distributed materials, read
the instructions and collected the completed templates. The
invigilator did not answer any questions.
The directions differed between the groups in exactly two ways.
First, Group A’s opening paragraph read:
“For this study your task is to develop one or more design
concepts for a mobile application to encourage healthy
living. A design concept is a high-level description of a
system. To help, an analyst has conducted several focus
groups around campus and produced the following
requirements specification.” (italics added)
While Group B’s opening paragraph read:
“For this study your task is to develop one or more design
concepts for a mobile application to encourage healthy
living. A design concept is a high-level description of a
system. To help, an analyst has conducted several focus
groups around campus and produced the following list of
ideas.” (italics added)
Neither “ideas” nor “requirements” were defined so as to retain
participants’ natural preconceptions and biases. These directions
were followed by a list of 24 desiderata for the app. Both groups
received the same desiderata in the same order. However, Group
A’s desiderata began with “the system shall” (consistent with
IEEE-830 [40]) while Group B’s desiderata began with “the
system might”. No other differences between the two groups were
introduced.
The desiderata themselves were written by the authors based on
features of existing health-related apps. The idea was to create a
realistically imperfect spec (cf. [4]) – the kind of jumble of ideas
that might be written by unsophisticated client or a programmer
with little RE training, rather than the polished work of an expert
requirements analyst. In other words, we tried to make a
document representative of what we have observed in previous
field work and professional practice. The desiderata (minus the
“the system shall / might” prefix) were as follows.
• play music
• reduce stress
• recommend activities
• recommend diet foods
• measure calorie intake
• facilitate diet planning
• analyze sleeping habits
• users share their experiences
• allow the user to plan workouts
• be user friendly and easy to use
• be technically stable and not crash
• track BMR (Basal Metabolic Rate)
• track what the user eats and drinks
• be compatible with iOS and Android
• count calories burned during workouts
• help the user stick to planned workouts
• recommend recipes based on user goals
• share user accomplishments on Facebook
• suggest ‘power foods’ based on my BMR
• connect the user to a doctor in an emergency
• track speed and distance for running, swimming, etc.
• provide instruction for diverse exercises and activities
• retain workout history and provide performance analysis
• recommend specific workouts at varying levels of difficulty
In addition to the desiderata, participants were given a conceptual
design template comprising several sheets of paper with blank
mobile-screen-sized boxes in landscape and portrait views and
adjacent space for explanations (Figures 1 and 2). Participants
could use as many templates as needed.
Participants had 60 minutes in which to complete their designs.
The invigilator then distributed a post-task questionnaire which
recorded demographic and contact information. All questions
were optional. The post-task questionnaire also included a
manipulation check where participants were asked to indicate the
importance of the desiderata in guiding their conceptual designs
on a five-point scale.
3.5 Grading
Two expert judges (the first and third authors) independently
graded the conceptual designs. Prior to grading, the conceptual
designs were anonymized, combined into a single set and
randomly shuffled such that the judges knew neither the
participant nor the group to which each design belonged. To keep
the evaluation as simple (and robust) as possible, there was no
complicated rubric for evaluating designs; rather, judges used a
simple five-point scale where a 1 indicates low originality and a 5
indicates high originality. We felt that a more granular scale would
lead to overprecision [58]. Here, originality refers to creativity
from the perspective of society, or h-creativity [6].
The judges discussed and marked the first 3 designs together to
establish a shared baseline. For example, in Figures 1 (from the
“ideas” group) and 2 (from the “requirements” group) we can see
how two different participants implemented diet tracking – one
using a simple written description while the other attempting to
quantify calories. The week-level overview (Figure 1) and ability
to share pictures of food (Figure 2) were considered especially
innovative. Figure 2 is notably more complex, feature-rich and
messy while Figure 1 is simpler and cleaner but with fewer
features.
The judges then marked the remaining designs separately (in
different rooms). As these ratings are subjective judgments,
reliability may be examined by calculating inter-rater agreement.
The judges agreed on 34 of the 42 conceptual designs. Using
Cohen’s Kappa [24], this gives an inter-rater agreement of 0.67,
which represents “substantial agreement” and therefore reasonable
reliability [45]. Disagreements were resolved by a third expert
judge (the second author) to create the grade data set used in the
analysis below.
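To illustrate the agreement calculation, the sketch below computes Cohen’s Kappa from two judges’ rating vectors. The vectors shown are hypothetical placeholders (the per-design grades are not reproduced here), so the sketch demonstrates the formula rather than reproducing the 0.67 figure.

# Sketch: Cohen's Kappa for two judges grading the same designs on a 1-5 scale.
# The rating vectors are hypothetical placeholders, not the study data.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b, categories=(1, 2, 3, 4, 5)):
    n = len(ratings_a)
    # Observed agreement: proportion of designs given the same grade by both judges.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement: probability of agreement if the judges rated independently,
    # estimated from each judge's marginal grade frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

judge_1 = [3, 4, 2, 5, 3, 3, 1, 4, 2, 3]  # hypothetical grades
judge_2 = [3, 4, 2, 4, 3, 2, 1, 4, 2, 3]  # hypothetical grades
print(round(cohens_kappa(judge_1, judge_2), 2))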
4. RESULTS AND ANALYSIS
To test Hypothesis H1, we need to compare the distributions of
originality grades (Table 1). Ideally we would test Hypothesis H1
with an efficient, parametric test such as an independent samples
t-test or (equivalently) a one-way analysis of variance. However,
these tests assume a normal distribution and homogeneity of
variance. Visual inspection of Figure 3 suggests that Grades may
not satisfy these assumptions. Homogeneity of variance may be
analyzed using Levene’s test, the Brown-Forsythe test and
Levene’s non-parametric test [25]. As none of the three tests
rejected the null hypothesis (Table 2), we assume that Grades
meets the homogeneity of variance assumption. However, analysis
using the Shapiro-Wilk test confirms we cannot safely assume that
Grades is normally distributed (p < 0.001 for Group A; p = 0.003
for Group B).
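For illustration, these checks can be reproduced from the grade frequencies reported in Table 1 below, for example with SciPy (one possible tooling; any standard statistics package would do). SciPy does not implement Levene’s non-parametric variant [25], so only the first two homogeneity tests appear in the sketch.

# Sketch: homogeneity-of-variance and normality checks for the originality grades,
# with the grade vectors reconstructed from the Table 1 frequencies.
from scipy import stats

group_a = [1]*1 + [2]*7 + [3]*12 + [5]*1            # "requirements" framing, n = 21
group_b = [1]*2 + [2]*1 + [3]*6 + [4]*10 + [5]*2    # "ideas" framing, n = 21

print(stats.levene(group_a, group_b, center='mean'))    # Levene's test
print(stats.levene(group_a, group_b, center='median'))  # Brown-Forsythe test
print(stats.shapiro(group_a))                           # Shapiro-Wilk, Group A
print(stats.shapiro(group_b))                           # Shapiro-Wilk, Group B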
Figure 1: Example conceptual design (Group A, landscape template)
Figure 2: Example conceptual design (Group B, portrait template)
The combination of non-normal distribution and homogeneity of
variance suggests using the Mann-Whitney U test. Mann-Whitney
makes four assumptions, all of which are met:
1. The dependent variable should be measured on an ordinal
or an interval level.
2. The independent variable should consist of two
categorically independent groups.
3. Each subject belongs to exactly one group and is not
subjected to the treatment of the other group.
4. The dependent variable exhibits homogeneity of variance.
Table 1: Grades Frequency

Grade     Group A (“Requirements”)     Group B (“Ideas”)
1         1                            2
2         7                            1
3         12                           6
4         0                            10
5         1                            2
Mean      2.67                         3.43
Median    3                            4
Figure 3: Grade Distribution across Groups
Table 2: Grades - Homogeneity of Variance

Levene’s test                     p = 0.183
Brown-Forsythe test               p = 0.354
Levene’s non-parametric test      p = 0.089
Based on Mann-Whitney analysis we reject the null hypothesis
(U=116.500, n=42, p=0.004). In other words, participants who
received the “ideas” framing produced more creative designs by a
statistically significant margin. Effect size (r) for the Mann-
Whitney U test is calculated using Equation 1 where ‘n’ is the
total number of samples. The result (r = 0.428) indicates a
medium-high effect [33].
Equation 1: r = |Z| / √n
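For illustration, the test and Equation 1 can be reproduced from the Table 1 frequencies, for example with SciPy (one possible tooling); exact values may differ slightly from those reported depending on how ties and continuity are handled.

# Sketch: Mann-Whitney U test and effect size r = |Z| / sqrt(n) (Equation 1),
# with grades reconstructed from the Table 1 frequencies.
from math import sqrt
from scipy import stats

group_a = [1]*1 + [2]*7 + [3]*12 + [5]*1            # "requirements" framing, n = 21
group_b = [1]*2 + [2]*1 + [3]*6 + [4]*10 + [5]*2    # "ideas" framing, n = 21

u1, p = stats.mannwhitneyu(group_a, group_b, alternative='two-sided',
                           method='asymptotic')     # normal approximation
u = min(u1, len(group_a) * len(group_b) - u1)       # report the smaller U statistic

z = stats.norm.isf(p / 2)                           # recover |Z| from the two-sided p-value
r = z / sqrt(len(group_a) + len(group_b))           # Equation 1
print(u, p, r)   # should be close to the reported U=116.5, p=0.004, r=0.428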
4.1 Exploratory Analysis of Fixation
Above, we theorized that framing desiderata as “requirements”
would increase designers’ propensity for fixation. While deep
insight into the cognitive mechanisms underlying fixation would
necessitate a different kind of study (e.g. a think-aloud protocol
study), we included a simple indicator of fixation in the post-task
questionnaire to facilitate some exploratory analysis. The question
read “How important was the list of specifications in guiding your
design?” and participants responded on a five-point scale from 1
(low) to 5 (high). We would expect participants in Group A to give
the specification higher importance ratings than participants in
Group B do. We also expect the importance placed on the
specifications to be inversely related to originality.
First, Group A reported higher average importance of
specification than Group B (Table 3; Figure 4). This difference
appears significant (Mann-Whitney U test, p=0.011; unequal
variances t-test, p=0.006). However, results should be interpreted
with caution as the data does not exhibit a normal distribution (as
assumed by the t-test) or homogeneity of variance (as assumed by
Mann-Whitney).
Table 3: Importance of Specification Frequency

Importance   Group A (“Requirements”)     Group B (“Ideas”)
1 (low)      0                            3
2            1                            5
3            2                            3
4            12                           6
5 (high)     6                            4
Mean         4.10                         3.14
Median       4                            3
Figure 4: Importance of Specification across Groups
Second, participants who rated the importance of the specification
as high (4 or 5 out of 5) produced less creative design concepts,
on average (Table 4, Figure 5). The difference between these
distributions is marginally significant (Mann-Whitney U test,
p=0.059).
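These exploratory comparisons can likewise be reconstructed from the frequencies in Tables 3 and 4 (again using SciPy as one possible tooling); exact p-values may differ slightly from those reported depending on tie handling.

# Sketch: exploratory tests reconstructed from the Table 3 and Table 4 frequencies.
from scipy import stats

# Importance-of-specification ratings (Table 3).
importance_a = [2]*1 + [3]*2 + [4]*12 + [5]*6             # Group A, n = 21
importance_b = [1]*3 + [2]*5 + [3]*3 + [4]*6 + [5]*4      # Group B, n = 21
print(stats.mannwhitneyu(importance_a, importance_b, alternative='two-sided'))
print(stats.ttest_ind(importance_a, importance_b, equal_var=False))  # Welch's (unequal variances) t-test

# Originality grades split by rated importance of the specification (Table 4).
grades_low = [2]*2 + [3]*5 + [4]*5 + [5]*2                # importance 1-3, n = 14
grades_high = [1]*3 + [2]*6 + [3]*13 + [4]*5 + [5]*1      # importance 4-5, n = 28
print(stats.mannwhitneyu(grades_low, grades_high, alternative='two-sided'))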
In summary, exploratory analysis of the fixation data suggests that
framing desiderata as “requirements” increases fixation and that
increased fixation may inhibit creativity. However, the evidence
for fixation is not as strong as the evidence that requirements
framing leads to less original design concepts and the statistical
analysis of fixation presented here should be interpreted with
caution.
Table 4: Grades Frequencies by Importance of Specification

Grade      Importance Low (1-3)     Importance High (4-5)     Total
1 (low)    0                        3                         3
2          2                        6                         8
3          5                        13                        18
4          5                        5                         10
5 (high)   2                        1                         3
Mean       3.5                      2.8                       3.0
Median     3.5                      3                         3
Figure 5: Grades by Importance of Specification
5. DISCUSSION
The above results strongly support a significant relationship
between the way desiderata are framed and the originality of
design concepts produced. In other words, simply using the terms
“requirements specification” and “shall” may reduce design
creativity.
Ralph [78] argued that requirements are a socially constructed
illusion. Specifically, in many projects, success is not clearly
understood; therefore, no one knows with any certainty which
desiderata are necessary for success. Moreover, many software teams do
not have access to highly-experienced, well-trained requirements
analysts. Therefore, real requirements documents often contain a
mixture of goal statements, necessary conditions for success,
meaningful but optional desiderata, design decisions and junk,
i.e., desiderata that are irrelevant to actual goals. In practice,
however, it may be impossible to differentiate necessary
conditions for success from junk. Consequently, RE in practice is
concerned not only with requirements but also with goals,
desiderata that do not clearly relate to goals, high-level design
decisions, obvious junk, non-obvious junk and generally
understanding the domain.
Being potentially illusory does not necessarily imply that
requirements are detrimental. Rather, the danger comes from a
misalignment between the epistemic uncertainty of the desiderata
and the skepticism of the designer. If requirements are highly
certain but the designers treat them as dubious, they may produce
artifacts that lack needed features. However, if requirements are
uncertain but designers treat them as definitive, they may produce
non-innovative artifacts.
While expert designers habitually treat given desiderata
skeptically regardless of how they are presented [27], our study
suggests that experienced (but not expert) designers are sensitive
to the framing of desiderata. Specifically they are more likely to
treat desiderata as more definitive if they are framed as
“requirements” and more skeptically if they are framed as “ideas”,
with the latter leading to more creative designs. We theorize that
this is related to mental-set fixation, where a practitioner restricts
the use of their abilities (creativity) due to a situationally induced
bias [42, 48]. As mental-set fixation is situationally-induced,
altering requirements practices including framing may be effective
in reducing fixation.
One unexpected difference between this study and similar studies
of design fixation was in the number of design concepts produced
by participants. As in previous studies, participants were asked to
produce as many design concepts as possible. Unlike previous
studies, however, all 42 participants in this study produced exactly
one design concept. As we did not notice this behavior until after
the task was completed, we can only assume that either the
directions were unclear or the time was insufficient to allow for
multiple design concepts.
Our findings should be interpreted in light of several limitations.
The present study was devised to determine whether desiderata
framing affects design creativity. While our exploratory analysis
suggests that requirements fixation may be involved, further
research is needed to determine exactly how framing relates to
design (see below). Moreover, as participants were not randomly
sampled from a population, statistical generalization of results is
not possible. Our participants’ behavior may not generalize to
experts. As we intentionally used a somewhat disorganized list of
desiderata, some but not most of which may be necessary
conditions for success, our results may not generalize to highly
refined specifications or other kinds of models (e.g. use cases).
Furthermore, this study focused on design concept originality,
which is not equivalent to design concept quality and does not
necessarily lead to original or high-quality implementations.
Finally, the artificial setting in which the study took place may
produce different dynamics than real software projects.
These limitations notwithstanding, our findings concerning
requirements fixation and the relationship between desiderata
framing and creativity have numerous implications for SE
research, practice and education.
Previous research on design fixation has examined how providing
designers with example solutions reduces their creativity. This
study extends this stream of research by demonstrating that, even
without examples, the framing of desiderata can negatively impact
designers’ creativity. This suggests at least three future research
possibilities. First, while RE research traditionally focuses on the
quality of requirements specifications, the presentation of
desiderata also appears important. Presentation issues include not
only modeling techniques (e.g. use cases, scenarios, goal models,
agent models, IEEE-830 style “the system shall” statements) but
also, as demonstrated here, the language used to convey them.
Second, while RE research traditionally focuses on distinguishing
mandatory desiderata (needs) and optional desiderata (wants), the
epistemic status of desiderata appears equally important. That is,
RE may benefit from techniques for indicating the epistemic
status of a desideratum, e.g., we are 80% certain that the system
will need to support encryption. Third, SE more generally may
benefit from more research on debiasing (including de-fixating)
developers and other software project actors. While debiasing is
notoriously difficult [34], psychological research on epistemic
rationality (calibration of belief to evidence [91]) may help. More
generally, the use of the term requirements in the academic
discourse may be over-rationalizing and oversimplifying the
diversity of possible desiderata. Therefore, the RE vernacular may
be obscuring innate disagreement and ambiguity in software
projects, leading to inaccurate theories and ineffective methods.
For practitioners, our results suggest that the term requirement
may curtail innovation independent of the requirements
specifications themselves. If innovative solutions are preferred,
desiderata should be framed to induce skepticism. While this
study used a list-of-ideas framing, we do not advocate simply
renaming “requirements” to “ideas” – the ideas language was
chosen simply to minimize the difference between the two groups.
Rather, we suggest that practitioners more generally consider two
properties of each desideratum – importance/priority [43] and
confidence [8, 56, 63]. Importance refers to how crucial a
desideratum is for success. Confidence refers to the certainty of
the desideratum’s relevance. We suggest that non-expert designers
interpret requirement as implying both high importance and high
certainty. To promote innovation, the term requirement should
therefore be reserved for desiderata that have high importance and
high certainty. Based on our results, we can only recommend
presenting less certain and less important desiderata in a manner
that promotes skepticism and is appropriate to the particular
context. However, this raises numerous questions for future
research, including: how do priority and confidence metadata affect
fixation and creativity? Moreover, we wonder about the mixed
signals of giving a desideratum low confidence or low importance
and still labeling it a requirement. While RE has increasingly
recognized the ambiguity and volatility of desiderata in many
domains, practitioners continue to exhibit (or feign)
overconfidence in “requirements”. This paper highlights the
potential adverse effects of this overconfidence on innovation.
Similarly, software engineering education continues to present
over-rationalized and oversimplified views of RE and design. The
IEEE/ACM official model curriculum for undergraduate degrees
in software engineering barely mentions design concept
generation [75]. The notion that analysts ‘elicit’ requirements and
designers translate those requirements into a system design is
simply misleading. SE education should incorporate more training
in creativity techniques, more realistically ambiguous projects and
generally stop presenting deeply oversimplified views of software
development. At the very least, students should be exposed to
realistically imperfect requirements specifications and the need to
distinguish legitimate requirements from junk requirements.
6. CONCLUSION
In summary, this paper investigated the question, does framing
desiderata as “requirements” negatively affect creativity in design
concept generation? The results of our exploratory experimental
study strongly suggest that, yes, simply using the terms
requirements and shall can deleteriously affect designers’
creativity. This highlights the potential power of minor changes in
vernacular and the sensitivity of designers to cognitive biases
including framing effects.
Building on previous research on design fixation, we propose the
concept of requirements fixation, i.e., disproportionate focus on
explicit desiderata framed as requirements. While previous
research (above) has demonstrated that designers may fixate on
the features of given example designs, our research suggests that
designers may also fixate on given desiderata. Like design
fixation, requirements fixation may be mitigated by highlighting
specific problems or the overall epistemic uncertainty surrounding
given information.
More research is needed to clarify the relationship between
desiderata framing, fixation and creativity. For example, fixation
could be more directly demonstrated by an experiment comparing
the creativity of designers given a goal and a set of
“requirements” to a control group given only a goal. Moreover,
think-aloud protocol studies [35], where participants explicate
their thinking during a task through continuous speech, may
provide insight into the cognitive mechanisms that mediate the
framing-creativity relationship. Replications with novice and
expert designers and confirmatory field studies are also needed.
More generally, future studies may investigate related cognitive
biases including anchoring, overconfidence and miserly
information processing in software engineering contexts, not to
mention approaches for debiasing participants.
In conclusion, this study highlights an innate tension between
innovating, which comes from new ways of seeing the world, and
satisfying explicit requirements, which are often rooted in a
contemporary worldview. Meanwhile, despite all of the problems
with requirements in principle and requirements specifications in
practice, many researchers and practitioners continue to pretend
that meeting requirements is the only, or at least the primary,
dimension of software engineering success.
7. ACKNOWLEDGMENTS
Thanks are due to all of the participants and to the Lancaster
University Management School for its financial support.
8. REFERENCES
[1] Amabile, T.M. 1996. Creativity in context: Update to “The
Social Psychology of Creativity.” Westview Press.
[2] Anton, A. 1996. Goal-Based Requirements Analysis.
Proceedings of the International Conference on
Requirements Engineering (Colorado Springs, Colorado,
USA).
[3] Arnott, D. 2006. Cognitive biases and decision support
systems development: a design science approach.
Information Systems Journal. 16, 1, 55–78.
[4] Bahill, A.T. and Dean, F.F. 2009. Discovering system
requirements. Handbook of Systems Engineering and
Management. A.P. Sage and W.B. Rouse, eds. John Wiley &
Sons. 205–266.
[5] Beck, K. 2005. Extreme programming eXplained: Embrace
change. Addison Wesley.
[6] Boden, M.A. 2003. The creative mind: Myths and
mechanisms. Routledge.
[7] Bohm, P. and Lind, H. 1992. A note on the robustness of a
classical framing result. Journal of Economic Psychology.
13, 2 (Jun.), 355–361.
[8] Boness, K., Finkelstein, A. and Harrison, R. 2011. A method
for assessing confidence in requirements analysis.
Information and Software Technology. 53, 10 (Oct.), 1084–
1096.
[9] Bourque, P. and Dupuis, R. eds. 2004. Guide to the software
engineering body of knowledge (SWEBOK). IEEE Computer
Society Press.
[10] Bresciani, P., Perini, A., Giorgini, P., Giunchiglia, F. and
Mylopoulos, J. 2004. Tropos: An Agent-Oriented Software
Development Methodology. Autonomous Agents and Multi-
Agent Systems. 8, 3, 203–236.
[11] Brooks, F.P. 2010. The Design of Design: Essays from a
Computer Scientist. Addison-Wesley Professional.
[12] Browne, G.J. and Ramesh, V. 2002. Improving information
requirements determination: a cognitive perspective.
Information & Management. 39, 8, 625–645.
[13] Bush, D. and Finkelstein, A. 2003. Requirements stability
assessment using scenarios. Proceedings of the 11th
International Requirements Engineering Conference. IEEE
Comput. Soc. 23–32.
[14] Cao, L. and Ramesh, B. 2008. Agile Requirements
Engineering Practices: An Empirical Study. IEEE Software.
25, 1, 60–67.
[15] Chakraborty, A., Baowaly, M.K., Arefin, A. and Bahar, A.N.
2012. The Role of Requirement Engineering in Software
Development Life Cycle. Journal of Emerging Trends in
Computing and Information Sciences. 3, 5.
[16] Charette, R.N. 2005. Why software fails. IEEE Spectrum. 42, 9.
[17] Checkland, P. 1999. Systems Thinking, Systems Practice.
Wiley.
[18] Cheng, B.H. and Atlee, J.M. 2007. Research directions in
requirements engineering. Proceedings of the Workshop on
the Future of Software Engineering (May), 285–303.
[19] Cheng, F.-F. and Wu, C.-S. 2010. Debiasing the framing
effect: The effect of warning and involvement. Decision
Support Systems. 49, 3, 328–334.
[20] Chow, T. and Cao, D.-B. 2008. A survey study of critical
success factors in agile software projects. Journal of Systems
and Software. 81, 6, 961–971.
[21] Chrysikou, E.G. and Weisberg, R.W. 2005. Following the
Wrong Footsteps: Fixation Effects of Pictorial Examples in a
Design Problem-Solving Task. Journal of Experimental
Psychology: Learning, Memory, and Cognition. 31, 5, 1134–
1148.
[22] Chung, L. and Leite, J.C.P. 2009. On Non-Functional
Requirements in Software Engineering. Conceptual
Modeling: Foundations and Applications: Essays in Honor
of John Mylopoulos. Springer-Verlag. 363–379.
[23] Cockburn, A. 2000. Writing Effective Use Cases. Addison-
Wesley.
[24] Cohen, J. 1960. A Coefficient of Agreement for Nominal
Scales. Educational and Psychological Measurement. 20,
37–46.
[25] Conover, W.J., Johnson, M.E. and Johnson, M.M. 1981. A
comparative study of tests for homogeneity of variances,
with applications to the outer continental shelf bidding data.
Technometrics. 23, 4, 351–361.
[26] Cropley, A. 2006. Creativity: A social approach. Roeper
Review. 28, 3, 125–130.
[27] Cross, N., Dorst, K. and Roozenburg, N. 1992. Research in
design thinking. Delft University Press.
[28] Dardenne, A., van Lamsweerde, A. and Fickas, S. 1993. Goal-directed
Requirements Acquisition. Science of Computer
Programming. 20, 3–50.
[29] Dorst, K. 1997. Describing Design: A comparison of
paradigms. Delft University of Technology. PhD
dissertation.
[30] Dorst, K. and Cross, N. 2001. Creativity in the design
process: Co-evolution of problem-solution. Design Studies.
22, 425–437.
[31] Dorst, K. and Dijkhuis, J. 1995. Comparing Paradigms for
Describing Design Activity. Design Studies. 16, 2, 261–274.
[32] El Emam, K. and Koru, A.G. 2008. A replicated survey of
IT software project failures. IEEE Software. 25, 5, 84–90.
[33] Field, A. 2009. Discovering statistics using SPSS. Sage
publications.
[34] Fischhoff, B. 1982. Debiasing. Judgment under uncertainty:
heuristics and biases. D. Kahneman, P. Slovic, and A.
Tversky, eds. Cambridge University Press.
[35] Fonteyn, M.E., Kuipers, B. and Grobe, S.J. 1993. A
Description of Think Aloud Method and Protocol Analysis.
Qualitative Health Research. 3, 4, 430–441.
[36] Ford, C.M. 2002. The futurity of decisions as a facilitator of
organizational creativity and change. Journal of
Organizational Change Management. 15, 6, 635–646.
[37] Glinz, M. 2007. On non-functional requirements.
Proceedings of the 15th International Conference on
Requirements Engineering (Delhi, India), 21–26.
[38] Great Britain Office of Government Commerce 2009.
Managing successful projects with PRINCE2. Stationery
Office Books.
[39] Guilford, J.P. 1959. Three faces of intellect. American
Psychologist. 14, 8, 469.
[40] IEEE 1998. IEEE Standard 830-1998: Recommended
Practice for Software Requirements Specifications.
[41] Jacobson, I., Booch, G. and Rumbaugh, J. 1999. The Unified
Software Development Process. Addison-Wesley Longman
Publishing Co., Inc.
[42] Jansson, D.G. and Smith, S.M. 1991. Design fixation. Design
Studies. 12, 1, 3–11.
[43] Karlsson, J. 1996. Software requirements prioritizing.
Proceedings of the Second International Conference on
Requirements Engineering. IEEE. 110–116.
[44] Kruchten, P. 2003. The Rational Unified Process: An
Introduction. Addison-Wesley Professional.
[45] Landis, J.R. and Koch, G.G. 1977. The Measurement of
Observer Agreement for Categorical Data. Biometrics. 33, 1,
159–174.
[46] Levin, I.P. and Gaeth, G.J. 1988. How consumers are
affected by the framing of attribute information before and
after consuming the product. Journal of Consumer Research.
(1988), 374–378.
[47] Levin, I.P., Schneider, S.L. and Gaeth, G.J. 1998. All frames
are not created equal: A typology and critical analysis of
framing effects. Organizational Behavior and Human
Decision Processes. 76, 2, 149–188.
[48] Luchins, A.S. and Luchins, E.H. 1959. Rigidity of behavior:
A variational approach to the effect of Einstellung. Univer.
Oregon Press.
[49] Lujun, Z. 2011. Design fixation and solution quality under
exposure to example solution. Proceedings of the 2nd
International Conference on Computing, Control and
Industrial Engineering, 129–132.
[50] Maher, M., Poon, J. and Boulanger, S. 1995. Formalising
design exploration as co-evolution: A combined gene
approach. Preprints of the Second IFIP WG5.2 Workshop on
Advances in Formal Design Methods for CAD, 1–28.
[51] Maheswaran, D. and Meyers-Levy, J. 1990. The influence of
message framing and issue involvement. Journal of
Marketing Research, 361–367.
[52] Maiden, N. and Gizikis, A. 2001. Where do requirements
come from? IEEE Software. 18, 5 (2001), 10–12.
[53] Maiden, N., Gizikis, A. and Robertson, S. 2004. Provoking
creativity: Imagine what your requirements could be like.
IEEE Software. 21, 5, 68–75.
[54] Maiden, N., Jones, S., Karlsen, K., Neill, R., Zachos, K. and
Milne, A. 2010. Requirements engineering as creative
problem solving: a research agenda for idea finding.
Proceedings of the 18th IEEE International Requirements
Engineering Conference, 57–66.
[55] Maiden, N., Manning, S., Robertson, S. and Greenwood, J.
2004. Integrating creativity workshops into structured
requirements processes. Proceedings of the 5th conference on
Designing interactive systems: processes, practices,
methods, and techniques, 113–122.
[56] Marchant, J., Tjortjis, C. and Turega, M. 2006. A metric of
confidence in requirements gathered from legacy systems:
two industrial case studies. Proceedings of the 10th
Conference on Software Maintenance and Reengineering.
IEEE.
[57] McGraw, K. and Harbison, K. 1997. User-centered
requirements: the scenario-based engineering process. L.
Erlbaum Associates Inc.
[58] Moore, D.A. and Healy, P.J. 2008. The trouble with
overconfidence. Psychological Review. 115, 2, 502.
[59] Mumford, M.D. 2003. Where have we been, where are we
going? Taking stock in creativity research. Creativity
Research Journal. 15, 2-3, 107–120.
[60] Mylopoulos, J., Chung, L. and Nixon, B. 1992. Representing and Using
Nonfunctional Requirements: A Process-Oriented Approach.
IEEE Transactions on Software Engineering. 18, 6, 483–497.
[61] Nakamura, J. and Csikszentmihalyi, M. 2001. Catalytic
creativity: The case of Linus Pauling. American
Psychologist. 56, 4, 337.
[62] Nguyen, L., Carroll, J. and Swatman, P.A. 2000. Supporting
and monitoring the creativity of IS personnel during the
requirements engineering process. Proceedings of the 33rd
Annual Hawaii International Conference on System
Sciences, IEEE.
[63] Nolan, A., Abrahao, S., Clements, P. and Pickard, A. 2011.
Managing Requirements Uncertainty in Engine Control
Systems Development. Proceedings of the 19th International
Requirements Engineering Conference. IEEE. 259–264.
[64] Nurmuliani, N., Zowghi, D. and Powell, S. 2004. Analysis of
requirements volatility during software development life
cycle. Proceedings of the Australian Software Engineering
Conference, IEEE, 28–37.
[65] Nuseibeh, B., Easterbrook, S. and Russo, A. 2000.
Leveraging inconsistency in software development.
Computer, 33, 4, 24–29.
[66] Oxford English Dictionary 2013. Oxford University Press.
[67] Parsons, J. and Saunders, C. 2004. Cognitive Heuristics in
Software Engineering: Applying and Extending Anchoring
and Adjustment to Artifact Reuse. IEEE Transaction on
Software Engineering. 30, (2004), 873–888.
[68] Perttula, M. and Sipilä, P. 2007. The idea exposure paradigm
in design idea generation. Journal of Engineering Design.
18, 1 (Feb.), 93–102.
[69] Prats, M., Lim, S., Jowers, I., Garner, S.W. and Chase, S.
2009. Transforming shape in design: observations from
studies of sketching. Design Studies. 30, 5, 503–520.
[70] Purcell, A.T. and Gero, J.S. 1996. Design and other types of
fixation. Design Studies. 17, 4, 363–383.
[71] Purcell, A.T. and Gero, J.S. 1992. Effects of examples on the
results of a design activity. Knowledge-Based Systems. 5, 1,
82–91.
[72] Purcell, A.T., Gero, J.S., Edwards, H.M. and Matka, E. 1994.
Design fixation and intelligent design aids. Proceedings of
Artificial Intelligence in Design, 483–495.
[73] Purcell, A.T., Williams, P., Gero, J.S. and Colbron, B. 1993.
Fixation effects: do they exist in design problem solving?
Environment and Planning B. 20, 333–333.
[74] Ralph, P. 2010. Comparing Two Software Design Process
Theories. Global Perspectives on Design Science Research:
Proceedings of the 5th International Conference, DESRIST
2010. R. Winter, J.L. Zhao, and S. Aier, eds. Springer. 139–
153.
[75] Ralph, P. 2012. Improving coverage of design in information
systems education. Proceedings of the 2012 International
Conference on Information Systems. AIS.
[76] Ralph, P. 2011. Introducing an Empirical Model of Design.
Proceedings of The 6th Mediterranean Conference on
Information Systems. AIS.
[77] Ralph, P. 2013. Software Engineering Process Theory: A
Multi-Method Comparison of Sensemaking-Coevolution-
Implementation Theory and Function-Behavior-Structure
Theory. arXiv 1307.1019 [cs.SE].
[78] Ralph, P. 2013. The Illusion of Requirements in Software
Development. Requirements Engineering. 18, 3 (2013), 293–
296.
[79] Ralph, P. The Sensemaking-Coevolution-Implementation
Theory of Software Design. arXiv 1302.4061 [cs.SE].
[80] Ralph, P. 2011. Toward a Theory of Debiasing Software
Development. Research in Systems Analysis and Design:
Models and Methods: 4th SIGSAND/PLAIS EuroSymposium
2011. S. Wrycza, ed. Springer. 92–105.
[81] Reel, J.S. 1999. Critical Success Factors In Software
Projects. IEEE Software. 16, 3, 18–23.
[82] Robertson, J. 2002. Eureka! why analysts should invent
requirements. IEEE Software. 19, 4, 20–22.
[83] Rothman, A.J., Salovey, P., Antone, C., Keough, K. and
Martin, C.D. 1993. The Influence of Message Framing on
Intentions to Perform Health Behaviors. Journal of
Experimental Social Psychology. 29, 5 (Sep.), 408–433.
[84] Runco, M.A. 2008. Commentary: Divergent thinking is not
synonymous with creativity. Psychology of Aesthetics,
Creativity, and the Arts. 2, 2, 93–96.
[85] Schmidt, R., Lyytinen, K., Keil, M. and Cule, P. 2001.
Identifying software project risks: an international Delphi
study. J. Manage. Inf. Syst. 17, 4, 5–36.
[86] Schön, D.A. 1983. The reflective practitioner: how
professionals think in action. Basic Books.
[87] Schwaber, K. 2004. Agile project management with Scrum.
Microsoft Press.
[88] Simon, H.A. 1996. The Sciences of the Artificial. MIT Press.
[89] Snow, A.P., Keil, M. and Wallace, L. 2007. The effects of
optimistic and pessimistic biasing on software project status
reporting. Information & Management. 44, 2, 130–141.
[90] Stacy, W. and MacMillan, J. 1995. Cognitive bias in software
engineering. Communications of the ACM. 38, 6, 57–63.
[91] Stanovich, K. 2009. What Intelligence Tests Miss: The
Psychology of Rational Thought. Yale University Press.
[92] Sternberg, R.J. 2001. What is the common thread of
creativity? Its dialectical relation to intelligence and wisdom.
American Psychologist. 56, 4, 360.
[93] Sutcliffe, A., Thew, S. and Jarvis, P. 2011. Experience with
user-centred requirements engineering. Requirements
Engineering. 16, 4, 267–280.
[94] Suwa, M., Gero, J. and Purcell, T. 2000. Unexpected
discoveries and S-invention of design requirements:
important vehicles for a design process. Design Studies. 21,
6, 539–567.
[95] Truong, K.N., Hayes, G.R. and Abowd, G.D. 2006.
Storyboarding: an empirical determination of best practices
and effective guidelines. Proceedings of the 6th ACM
Conference on Designing Interactive Systems (Jun.). ACM.
12–21.
[96] Tversky, A. and Kahneman, D. 1981. The framing of
decisions and the psychology of choice. Science. 211, 453–
458.
[97] Tversky, A. and Kahneman, D. 1985. The framing of
decisions and the psychology of choice. Behavioral decision
making. Springer. 25–41.
[98] van Lamsweerde, A. 2004. Goal-oriented requirements
engineering: a roundtrip from research to practice.
Proceedings of the 12th IEEE International Requirements
Engineering Conference (RE'04). IEEE. 4–7.
[99] Verner, J., Cox, K., Bleistein, S. and Cerpa, N. 2007.
Requirements engineering and software project success: an
industrial survey in Australia and the US. Australasian
Journal of Information Systems. 13, 1.
[100]Yeo, K.T. 2002. Critical failure factors in information system
projects. International Journal of Project Management. 20,
3, 241–246.
[101]Youmans, R.J. 2011. The effects of physical prototyping and
group work on the reduction of design fixation. Design
Studies. 32, 2 (Mar.), 115–138.
[102]Yu, E. 1997. Towards Modelling and Reasoning Support for
Early-Phase Requirements Engineering. Proceedings of the
Third International Symposium on Requirements
Engineering (Jan.). IEEE. 226–235.
[103]Zowghi, D. and Nurmuliani, N. 2002. A study of the impact
of requirements volatility on software project performance.
Proceedings of the Ninth Asia-Pacific Software Engineering
Conference. IEEE. 3–11.