Which Information Helps agile Teams the Most?
An Experience Report on the Problems and Needs
Fabian Kortum, Jil Klünder, Oliver Karras, Wasja Brunotte and Kurt Schneider
Software Engineering Group
Leibniz University Hannover
30167 Hannover, Germany
{fabian.kortum, jil.kluender, oliver.karras, wasja.brunotte, kurt.schneider}@inf.uni-hannover.de
Abstract—Fast feedback enables agile teams to improve their work during the software process, making it crucial for team success. Information systems accelerate the availability of information, resulting in compact knowledge sources.
In practice, feedback in Sprints is often limited to sole progress and performance measures, e.g., burndown charts or velocity diagrams. Sprint insights related to team dynamics are rarely considered, even though issues such as a lack of social interaction frequently cause project failures. In this paper, we describe a survey study conducted with international members of the software engineering community to reveal which information helps agile teams the most and provides practical support in Sprints.
We describe the results in an experience report highlighting the frequent information problems and needs of agile teams, considering the perspective of 90 researchers and practitioners. The responses were quantitatively interpreted. The report promotes understanding of what kind of information would be useful for agile development teams. Moreover, it reveals which information problems were perceived as crucial for project success and as avoidable through proper team feedback.
The study endorses practical needs for system-aided feedback that supplies knowledge on the human factors in Sprints. The findings are relevant for practitioners and researchers who struggle with improving team feedback based on information needs.
Index Terms—information needs; experience report; human
factors; sprint feedback; team dynamics
I. INTRODUCTION
Agile development teams strive for fast and continuous
feedback [1], [2]. Information systems that involve automated
data processes promote fast feedback and decision support
in ongoing software projects [3]. When development teams seek to increase their performance, to improve processes, or to identify problem causes, the availability of practical information and feedback is essential [4]. In agile software development, the diversity of considered information is often limited to objective progress measures relevant for tracking team performance, e.g., burndown charts, velocity diagrams, and estimation errors [5], [6]. Nevertheless, what information is considered relevant for the development process depends on the problems and needs of teams and management [7].
The development performance of teams sometimes varies for less apparent reasons rooted in the social nature of agile teams [8]. In retrospectives, knowledge about team dynamics must be reported, discussed, and interpreted more often and without social boundaries to enable reasonable explanations of causes [9]–[11]. However, the acquisition and interpretation of soft team factors and related effects during the development process can cause fear and a lack of trust among team members [12]. That is mainly the case when the individual or team benefit seems marginal or barely recognizable in the big picture, and when individuals worry about adverse effects.
Development teams can benefit from system-aided feedback if
the provided information addresses practical needs and returns
recognizable benefits [3], [13].
In previous studies, we investigated the team dynamics in
software projects and operationalizable concepts to capture,
process, and interpret dependencies in Sprints [13]–[15].
This included a first case study on the practical information
usages and problems in agile development teams from the
industry [16]. For a broader understanding of information
management in teams, we wanted to consider the experiences
encountered by other practitioners and researchers in the past.
The following research questions are the subject of this study.
RQ1: What types of information do agile development
teams practically use before and during Sprints?
RQ2: What feedback information supports agile teams the most regarding frequently encountered Sprint problems?
We addressed these questions by conducting a survey
study with a total of 90 researchers and practitioners. The
respondents shared individual experiences with frequent
team problems, encountered information management, and
practiced team feedback in Sprints. The survey data is available online [17].
II. RELATED WORK
Our study is based on related work from data science
and behavioral science, which supports the understanding of
information needs and usages in agile development teams.
Cockburn and Highsmith [18] addressed the need to understand what factors affect team performance when working with agile practices. Years later, Azizyan et al. [19] found that,
so far, only little is known about what instruments software
companies use to support their agile methods and whether they
are satisfied or dissatisfied with them. The authors attribute this to a lack of objective surveys on the subject.
They performed an independent survey study involving 121 respondents from 120 different companies, with the result that the most satisfactory tool aspect was the ease of use, while the least satisfactory one was the lack of integration with other systems.
Misra et al. [20] conducted a survey involving respondents who practiced agile software development and had experience with plan-driven software development. From the perspective of agile software development practitioners, the authors identified previously experienced or observed factors influencing project success, e.g., customer satisfaction, customer collaboration, customer commitment, decision time, corporate culture, control, personal characteristics, societal culture, and training and learning.
Larson and Chang [21] presented a review of the future directions of agile software development, business intelligence, and data analytics. The authors found that agile principles and practices have evolved together with business intelligence, including its challenges and future directions. For example, data can be acquired by tracking quantitative or qualitative project and team information in information systems. Automated operations then find relations, patterns, and discontinuities within the development records, e.g., using data mining strategies. Finally, the findings become visible through feedback mechanisms that provide decision support for future development iterations. With this, agile practices align well with prescriptive and predictive analytics [22].
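As a minimal illustration of such an automated operation, the following Python sketch flags discontinuities in a Sprint velocity series using a simple z-score rule; the velocity values and the threshold are invented for illustration and are not taken from the cited work.

    import statistics

    # Hypothetical completed story points per Sprint (illustrative values only)
    velocities = [21, 23, 22, 24, 9, 25, 22, 23]

    mean = statistics.mean(velocities)
    stdev = statistics.pstdev(velocities)

    # Flag Sprints whose velocity deviates strongly from the team average
    for sprint, velocity in enumerate(velocities, start=1):
        z_score = (velocity - mean) / stdev
        if abs(z_score) > 2:  # assumed threshold for a "discontinuity"
            print(f"Sprint {sprint}: velocity {velocity} deviates (z = {z_score:.1f})")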
Kortum et al. [13], [14] introduced a system-aided feed-
back concept for JIRA that combines the data capturing,
processing, and characterization of team dynamics in agile
software projects. The authors conducted several case studies
in industrial and academic software projects and identified behavioral factors that influenced team performance, considering objective and subjective measures. The information
was automatically processed within JIRA and provided teams
with fast feedback on actual Sprint dependencies.
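To make the data-capturing idea more tangible, the following sketch shows how Sprint issue data could be pulled from JIRA's REST search endpoint; the server URL, JQL filter, credentials, and story-point field id are assumptions for illustration and do not reflect the plug-in's actual implementation.

    import requests

    # Assumed values for illustration; not the actual project setup
    JIRA_URL = "https://jira.example.com"
    JQL = "project = DEMO AND sprint in openSprints()"
    STORY_POINTS_FIELD = "customfield_10016"  # story points are a custom field; the id varies per instance

    def fetch_open_sprint_issues(session):
        """Fetch the issues of the currently open Sprint via JIRA's search API."""
        response = session.get(
            f"{JIRA_URL}/rest/api/2/search",
            params={"jql": JQL,
                    "fields": f"summary,status,{STORY_POINTS_FIELD}",
                    "maxResults": 100},
        )
        response.raise_for_status()
        return response.json()["issues"]

    if __name__ == "__main__":
        with requests.Session() as session:
            session.auth = ("user", "api-token")  # placeholder credentials
            issues = fetch_open_sprint_issues(session)
            remaining = sum(issue["fields"].get(STORY_POINTS_FIELD) or 0
                            for issue in issues
                            if issue["fields"]["status"]["name"] != "Done")
            print(f"Open Sprint: {len(issues)} issues, {remaining} story points remaining")

Such raw issue data would still need to be combined with the subjective team measures described above before it can characterize team dynamics.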
III. MOTIVATION FOR A SURVEY STUDY
In our previous work, we have extensively studied the
interactions and information flow in agile teams and their
effects on development performance based on behavioral fac-
tors [23]. Our research project, in cooperation with industrial
partners, resulted in a system-aided feedback concept that is
operationalized as a plug-in for JIRA [13], [14].
However, the feedback information covered in this plug-in is limited to the experiences we gained during interviews, case studies, and experiments within a total of 92 academic and four industrial software projects. Therefore, the actual information types practically used by the teams, as well as the feedback needed to counteract problems, were bound to these socio-technical environments.
Motivated by related studies [20], [24], we aimed for more
objective views on the usefulness of system-aided feedback
solutions. Therefore, we conducted this survey-based research
to close knowledge gaps and to capture more independent
perspectives from professionals in the domain.
IV. SURVEY STRUCTURE AND FORMATS
In 2019, we promoted the survey at well-known
international conferences, e.g., ICSE, FSE, PROFES, REFSQ,
and RE. The survey design builds on a self-administered
(online) questionnaire that captured personal information
(e.g., respondent’s experiences, opinions, attitudes, and
recommendations), as well as the demographics. We followed
Fowler’s guidance for quantitative surveys [25] and realized
the self-administered questionnaire using LimeSurvey [17].
The study included twelve questions divided into four sections, considering the information usages, usual needs, frequent problems, and feedback strategies in agile development teams. We quantitatively interpreted the responses corresponding to the twelve questions. The number of participants per question is annotated with N and could vary since not all items were mandatory. All questions were peer-reviewed by the authors and involved four common answer formats:
Numerical values (e.g., years of experience)
Select-options (e.g., research field)
Yes/No answers (e.g., a polar question)
Ordinal scales (e.g., agreement scale)
All open-ended questions in the survey were analyzed
manually using a qualitative method [26]. Therefore, these
answers were read, classified, and quantitatively interpreted.
We promoted the study with flyers and email invitations.
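As an illustration of how classified open-ended answers can be turned into quantitative results, the following sketch counts reason labels assigned during a manual review; the labels echo reasons quoted later in this report, while the answer list itself is an invented example rather than the actual survey data.

    from collections import Counter

    # Hypothetical labels assigned to open-ended answers during manual review
    classified_answers = [
        "lack of tools", "no knowledge about it", "too much effort",
        "lack of tools", "no time", "too much effort", "lack of tools",
    ]

    counts = Counter(classified_answers)
    total = len(classified_answers)
    for reason, count in counts.most_common():
        print(f"{reason}: {count} ({count / total:.0%})")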
V. INTERPRETATION OF RESPONSES
The overall responses enabled an insightful assessment in which we characterized the individual opinions and experiences corresponding to each participant's demographics. In the following, the questions of this study are summarized, and the answers are quantitatively analyzed.
A. Demographics of Participants
Q1: asked for the profession of the participants. It was a
closed-ended question that covered usual occupations in the
research and development field of computer science.
In the research field, software engineering dominated with 85%. Among practitioners, the most frequent business roles were Developer (35%), Project Manager (20%), and Software Architect (20%). Some researchers were also practitioners. A summary of the participants' professions and experience backgrounds is shown in Fig. 1. The study unintentionally involves more researchers than practitioners, at a ratio of about 2:1.
Q1.1: is a sub-question asking the participants for their years of experience in their profession. The median experience of researchers was eight years, while practitioners reported a median of five years. In total, N=85 responses were considered.
Q2: asked the participants for their workplace location.
A drop-down list offered selection options of 194 countries.
Figure 2 summarizes the total number of participants per
country. Germany was the dominating location with the most participants, which is explainable by the authors' origin.
Fig. 1. Research Fields and Business Roles of Participants (N=85) - Q1
B. Team-related Problems and Avoidable Risks
Q3: held a multi-selection format in which the respondents were asked to select the team-related issues that they repeatedly encountered during Sprints. Alternative problems
could have been added manually. Each selection resulted in
two additional sub-questions.
Q3.1: resulted in a list of selected team issues from Q3. For
each entry, the participants were asked whether they believe
that the problem was avoidable by proper team feedback.
Q3.2: resulted in a list of selected team issues from Q3. For
each entry, the participants were asked whether they believe
that the problem had risked the Sprint’s success.
Fig. 2. Workplace Countries of Participants (N=85) - Q2
The relative number of encountered team-related problems and the relative perceptions of problems that were avoidable or risked the Sprint success are summarized in Fig. 3. The responses (N=90) revealed that the five most encountered team issues in Sprints included lacking communication, lacking quality thinking, lacking skills, misperceptions, and withheld information, which were also perceived to be avoidable through proper team feedback by an average of 65%. Furthermore, 63% of respondents believed these five issues also risked the Sprints' success.
Fig. 3. Team-related Issues Encountered in Sprints (N=90) - Q3
Fig. 4. Information Usages in Teams during Sprint Planning (N=80) - Q4
C. Information Usages of agile Teams in Sprints
Q4: asked the participants how often they have encountered that the information listed in Fig. 4 was used for Sprint planning. Up to three additional information types could be manually added to the list. All information types with frequencies "never" or "rarely" resulted in a sub-question.
Q4.1: asked the participants for an assumption about why
they think that the selected information is never/rarely used.
The perceived use of different information types during Sprint planning is shown in Fig. 4. Information relying on computer-aided Sprint forecasts, team communication reviews, documented Sprint benefits, as well as team cohesion and atmosphere revealed usage rates below 30%. For example, the rare usage of computer-aided Sprint forecasts during Sprint planning was repeatedly justified with "lack of tools" or "no knowledge about it", as well as "it requires too much effort" or "there is no time for that". The descriptive responses were manually
reviewed, classified, and quantitatively interpreted. Figure 5
shows the eight classified reasons; each was mentioned at least
four times by different respondents. Other responses were too
unique or generic to be classified. However, all raw responses
can also be found in the survey data [17].
Fig. 5. Reasons of few Information Usages in Sprint Planning (N=41) - Q4.1
Q5: holds a list of team-related problems with a focus on Sprint retrospectives. The participants were asked to rate how often they had experienced that developers shared, reported, or discussed team-related issues in Sprint retrospectives.
The frequency of team-related problem reports in Sprint
retrospectives is summarized in Fig. 6. For example, the task-
based problems were ”Always/Usually” thematized in 72% of
the Sprint retrospectives or reviews. This is a trivial outcome
considering the central relevance of tasks in agile software
development. However, the result seems very positive, consid-
ering that only 6% addressed this problem ”rarely” or ”never”.
In contrast, it can also be interpreted that the observed teams have fewer task-based problems to report, implying that they do a good job. However, given the N=69 participants, some objective trends can be found within the responses.
In the following, we consider only those problems that were rated "rarely" or "never" more often than the combined ratings of "sometimes", "usually", and "always". In the case of Fig. 6, this reveals burnout situations, defense of (dysfunctional) habits, and social-driven conflicts as the most problematic factors. These three aspects represent central factors in agile practices, which build on people-centered software development [18]. It is highly desirable to support problem sharing within those teams, and managers should increase the awareness of ongoing situations in teams without pointing out individuals.
D. Team and Sprint Aspects for Feedback
Q6: asked the participants how often they think it is best to provide team feedback. The multiple-choice options are shown in Fig. 7.
Based on the participants' opinions, "on-demand" and "pre-scheduled" feedback times resulted in the highest affirmation. Both are standard agile practices, either in Sprint events such as reviews or, e.g., in dailies/weeklies, when teams can address problems. This does not diminish the 19% of all respondents who voted for more team feedback through "permanent concepts", e.g., real-time dashboards. The responses indicate that team feedback should be available whenever needed. The reactions are essential for system-aided feedback solutions since they grant valuable insights into how often team feedback is expected in practice.
Fig. 6. Problem Sharing/Reporting in Teams during Sprint Retrospectives or Reviews (N=69) - Q5
Fig. 7. Rating on Team-feedback Frequency (N=74) - Q6
Q8: held two statements, which the respondents either affirmed or rejected corresponding to their experiences.
"Teams and Managers often lack Awareness or Experiences about the Human Factors in Software Projects." 59% of the participants affirmed the statement (N=69). Only 13% of the respondents disagreed with the statement, while 28% replied that they were uncertain.
"Understanding the Team Behavior in past Sprints is Crucial for Improvements in next Sprint Planning." 86% of the participants affirmed the statement (N=69). Only 4% of the respondents disagreed with the statement, while 10% replied that they were uncertain.
We involved these two statements in the survey because we wanted to identify whether there are dependencies with the previous results of Q4 and Q5. Risks for Sprint success caused by a lack of awareness of the human factors in agile software development can be avoided by teams and managers. Lacking tools or their complexity should not be the main reason why social-driven problems and team behaviors are barely thematized during Sprint retrospectives.
Q9: asked the respondents whether they think that sharing status information in exchange for team improvement opportunities is reasonable.
The responses to this question have a truly subjective nature but provide deep insights into the participants' split perspectives on this sensitive topic. For 46% of the participants (N=68), it is legitimate to share status information for improvement chances. In contrast, 53% were uncertain, and only 1% rejected this as a sufficient reason. However, improvements build on adequate information, and so does team feedback. Developers must be willing and motivated to support such insights to enable a better understanding of improvement options.
Q10: asked the participants to select five relevant types of system-aided feedback information from a predefined list, according to what they believe to be the most helpful information for
agile teams. The survey provided an interactive tool-tip to ensure the proper understanding of the information options.
Fig. 8. Information Rating for System-Aided Feedback Support in agile Teams (N=65) - Q10
The ten predefined feedback aspects build on related work and active research [14], [15], [27]. The rating results are shown in Fig. 8 and highlight which feedback information was considered most supportive for agile teams. Each aspect could theoretically have reached a maximum of 100%, which would require that all participants had voted for this particular item with one of their five possible choices. Double selections were not possible. The red average line marks the 50% border, while the yellow lines characterize the calculated standard deviation of σ = 0.2 based on the distribution of the responses. We clustered the relevance ratings as follows; an illustrative sketch of this grouping follows the group listing below.
Group 1: The system-aided feedback on team balance and workload (76%) and problems and conflicts (72%) was categorized as highly relevant to support teams with feedback.
Group 2: The system-aided feedback on customer reflection (68%), team communication and structures (65%), project manager reflection (59%), and team atmosphere (53%) was categorized as relevant for teams since the relevance ratings were above the average but not above the σ = 0.2 boundary.
Group 3: The system-aided feedback on Sprint planning advice (35%) and interdependencies in Sprints (34%) was categorized as less relevant for teams since the results were in the lower average range but above the minimum σ-boundary.
Group 4: The system-aided feedback on vulnerabilities in code commits (24%) and time-series forecasts (15%) was categorized as not relevant for teams since the results were in the lower average range and outside the minimum σ = 0.2 range.
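As an illustration of this grouping rule, the following sketch clusters the reported relevance ratings by the 50% average line and the σ = 0.2 band; the ratings are the percentages from Fig. 8, while the grouping function is a simplified reconstruction for illustration, not the authors' actual analysis script.

    # Relevance ratings reported in Fig. 8 (fraction of respondents selecting each aspect)
    ratings = {
        "team balance and workload": 0.76, "problems and conflicts": 0.72,
        "customer reflection": 0.68, "team communication and structures": 0.65,
        "project manager reflection": 0.59, "team atmosphere": 0.53,
        "Sprint planning advice": 0.35, "interdependencies in Sprints": 0.34,
        "vulnerabilities in code commits": 0.24, "time-series forecasts": 0.15,
    }

    AVERAGE, SIGMA = 0.50, 0.20  # red average line and yellow deviation band

    def group(rating):
        """Assign a rating to one of the four relevance groups described above."""
        if rating > AVERAGE + SIGMA:
            return "Group 1 (highly relevant)"
        if rating > AVERAGE:
            return "Group 2 (relevant)"
        if rating > AVERAGE - SIGMA:
            return "Group 3 (less relevant)"
        return "Group 4 (not relevant)"

    for aspect, rating in ratings.items():
        print(f"{aspect}: {rating:.0%} -> {group(rating)}")

Applied to the reported percentages, this rule reproduces the four groups listed above.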
The following two questions were added to the survey to receive some first feedback on the acceptance of and thoughts on the JIRA solution. In the last section of this survey,
we introduced the functional features of our operationalized
system-aided feedback solution for JIRA [13], [14], [28]. An
animated overview of the information support was provided
in the survey.
Q11: asked the participants about their general acceptance of the introduced system-aided feedback concept. The responses were given in a closed-ended answer format with five options. The responses of the participants (N=68) are shown in Fig. 9.
Fig. 9. Acceptance of Participants for System-aided Feedback (N=68) - Q11
Q12: asked whether they think it is helpful to involve both retrospective Sprint information and future implications within our system-aided feedback concept. A total of 84% of the responses (N=68) stated "yes", while 16% replied "no".
VI. DISCUSSION OF RESULTS
With the help of this survey study, we captured opinions and
experiences about the information used in agile development
teams from a total of 90 researchers and practitioners with
computer science professions. Different questions reached a varying number of participants because some questions were not mandatory or because respondents could not answer them, as well as for other unspecified reasons. Consequently, the quantitative interpretation of each question's responses was made with a varying N.
Q1 and Q2 summarized the participation variance according to the professions in computer science, the workplace location, and the years of experience of each respondent. Figure 1 revealed that the years of experience of the participating researchers and practitioners were close. The fact that 30% of all participants were from Germany and that nearly 22% more researchers than practitioners participated in this survey might have biased the independence of the overall study outcome. However, we did not further distinguish the different demographics in this paper.
Q3 revealed that the most frequent team-related problems in Sprints refer to lacking communication, withheld information, and misperceptions in organizations. The replies also showed that team feedback was assumed to reduce the risk of Sprint failures by providing more information transparency in projects without harming individuals' privacy. Surprisingly, only 51% of the participants observed that teams involve past Sprint performance information in the next Sprint planning. That is an intriguing outcome, since the common information sources, e.g., burndown charts or velocities, are usually available from previous Sprints.
Q4 and Q5 questioned the frequency of information types used by teams during Sprint planning. The respondents replied, e.g., that most of the observed agile teams did not use or consider system-aided Sprint forecasts for their planning because of lacking tools and, primarily, the time for approaching such techniques. Most respondents stated that these technical solutions often do not add value or are too inaccurate to be considered during the development process.
The responses from Q3 to Q5 enabled us to answer RQ1 about the experiences made or observed related to information usages in teams. The responses showed that agile development teams mostly work against time. This is also reflected in the information selected and used during Sprint planning, retrospectives, and reviews. Simple objective information, without costs in time or effort, finds the most consideration in teams.
Q6 to Q12 addressed aspects related to (system-aided) feedback in practice and the information types that are needed most to support teams during Sprint retrospectives and reviews. The responses revealed that most technical issues, such as task-based problems, unsatisfactory test results, or insufficient quality, were frequently addressed in Sprint retrospectives and reviews. All three aspects have in common that they build on objective measures. In contrast, most information based on subjective data, such as dysfunctional habits, social-driven conflicts, and burnout situations, was often not sufficiently thematized in software projects.
The responses to Q6 to Q12 enabled us to answer RQ2 about which feedback information supports agile teams the most in solving the frequently arising problems in Sprints. The responses pointed out that additionally provided team feedback should involve information on issues, conflicts, teamwork, workload balance, and communication structures, as well as customer and manager reflections. These seven factors presented the highest demand, according to the participants.
VII. THREATS TO VALIDITY
Our survey is an instrument for measuring the experiences
of practitioners and researchers covering the information us-
ages of agile development teams in Sprints [7]. The study is
subject to some threats to validity [29].
Construct validity: We collected, analyzed, and interpreted participants' experiences solely based on their subjective responses. No analyses were performed regarding external references or comparable studies with survey measures. Also, no tests for determining statistical significance were done. The interpretations of the responses for all questions are based on quantitative measures. Due to the human nature of reading scientific data, interpretation gaps might be possible.
Content validity: The authors subjectively assessed how appropriate the survey instrument seems and, with domain knowledge on team factors in agile software development, also the design of the questionnaire. The survey-based research, the question selection, and the answer formats are based on our related work in Sec. II [14], [25], [30]. Remarks of all authors were iteratively adopted, followed by post-reviews. Therefore, there are no objective measures for the survey validity, but the pre-assessments provide, in our opinion, a solid foundation. However, we are aware that many questions are based on closed-ended answer formats with predefined response options that may have flawed or biased the objectivity of the findings. Nevertheless, participants almost always had the chance to add individual options as well.
Criterion validity: The responses enabled the identification
of respondents belonging to different groups without elicit-
ing identities or harming the anonymity of their responses.
However, we did not differentiate the answers according to the participants' demographics and interpreted all responses holistically. Some questions had sub-questions, e.g., when "communication" was selected as the most frequent problem in teams, the sub-question asked how relevant the item was for project success based on the respondent's experiences. Additional sub-questions allowed descriptive responses, which required manual reviews and the classification of keywords, followed by quantification. This subjective process depends on the reviewer's skills and domain knowledge. A replication of the study with the same or other reviewers might lead to a different quantification of the descriptive responses. Also, some participants might not have had the experience to report team problems and needs, which may be why some subjects did not answer the survey altogether.
VIII. SUMMARY
In this study, we introduced a self-administered online ques-
tionnaire that captured the subjective opinions and expertise
of 90 practitioners and researchers. This study’s objectives
were to provide an experience report about how experts from
different computer science professions and global workplaces
perceive information management in organizations.
The participants’ responses resulted in a quantitative conclu-
sion covering the constant information needs and problems in
teams. The study included descriptive replies and assumptions
towards what information helps agile teams the most in Sprints
and provides practical support to improve team feedback.
Besides, the overall quantitative results revealed that 69% of the stated information-related team problems were also crucial for project success. Also, 78% of the participants believed that the encountered team problems could have been avoided with proper feedback.
Given the researchers’ and practitioners’ responses, we
identified several additional information sources that are fre-
quently used during Sprint planning, retrospectives, and re-
views. We also found which team-related issue arises most
repetitively in Sprints, and which information in agile teams
usually consider addressing the problems.
Another key outcome was that the responses allowed us to conclude an overall acceptance of 84% for our system-aided feedback concept. With a focus on future feedback support, the openness towards system-aided solutions shown in this study enables researchers to work on more practical solutions to supply teams with additional knowledge on Sprint and team performance. Besides, the reactions highlighted several other problems and demands related to feedback mentality. These and other aspects will be considered in our future work towards better team support.
ACKNOWLEDGMENT
This work was funded by the German Research Foundation (DFG) under the project name Team Dynamics (2018–2020), grant number 263807701.
REFERENCES
[1] A. Vetrò, S. Ognawala, D. M. Fernández, and S. Wagner, "Fast feedback cycles in empirical software engineering research," in Proceedings of the 37th International Conference on Software Engineering - Volume 2. IEEE Press, 2015, pp. 583–586.
[2] L. Williams and A. Cockburn, "Agile software development: it's about feedback and change," IEEE Computer, vol. 36, no. 6, pp. 39–43, 2003.
[3] A. Vetrò, R. Dürre, M. Conoscenti, D. M. Fernández, and M. Jørgensen, "Combining data analytics with team feedback to improve the estimation process in agile software development," Foundations of Computing and Decision Sciences, vol. 43, no. 4, pp. 305–334, 2018.
[4] D. B. Walz, J. J. Elam, and B. Curtis, "Inside a software design team: knowledge acquisition, sharing, and integration," Communications of the ACM, vol. 36, no. 10, pp. 63–77, 1993.
[5] S. Downey and J. Sutherland, "Scrum metrics for hyperproductive teams: how they fly like fighter aircraft," in 2013 46th Hawaii International Conference on System Sciences. IEEE, 2013, pp. 4870–4878.
[6] N. Oza and M. Korkala, "Lessons learned in implementing agile software development metrics," in UKAIS, 2012, p. 38.
[7] M. Drury, K. Conboy, and K. Power, "Obstacles to decision making in agile software development teams," Journal of Systems and Software, vol. 85, no. 6, pp. 1239–1254, 2012, Special Issue: Agile Development. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0164121212000374
[8] E. Whitworth and R. Biddle, "The social nature of agile teams," in Agile 2007 (AGILE 2007). IEEE, 2007, pp. 26–36.
[9] T. Chau and F. Maurer, "Knowledge sharing in agile software teams," in Logic versus Approximation. Springer, 2004, pp. 173–183.
[10] J. Pfeffer, Grundlagen der agilen Produktentwicklung: Basiswissen zu Scrum, Kanban, Lean Development. BoD – Books on Demand, 2019.
[11] E. Derby, D. Larsen, and K. Schwaber, Agile Retrospectives: Making Good Teams Great. Pragmatic Bookshelf, 2006.
[12] M. Hertzum, "The importance of trust in software engineers' assessment and choice of information sources," Information and Organization, vol. 12, no. 1, pp. 1–18, 2002. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1471772701000070
[13] F. Kortum, J. Klünder, and K. Schneider, "Behavior-driven dynamics in agile development: The effect of fast feedback on teams," in 2019 IEEE/ACM International Conference on Software and System Processes (ICSSP), May 2019, pp. 34–43.
[14] F. Kortum, O. Karras, J. Klünder, and K. Schneider, "Towards a better understanding of team-driven dynamics in agile software projects," in Product-Focused Software Process Improvement, X. Franch, T. Männistö, and S. Martínez-Fernández, Eds. Cham: Springer International Publishing, 2019, pp. 725–740.
[15] S. Dorairaj, J. Noble, and P. Malik, "Understanding team dynamics in distributed agile software development," in Agile Processes in Software Engineering and Extreme Programming, C. Wohlin, Ed. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012, pp. 47–61.
[16] J. Klünder, F. Kortum, T. Ziehm, and K. Schneider, "Helping teams to help themselves: An industrial case study on interdependencies during sprints," in Human-Centered Software Engineering. Cham: Springer International Publishing, 2019, pp. 31–50.
[17] F. Kortum, "An Experiences Survey about Sprint Feedback for Teams," Dec. 2019. [Online]. Available: https://doi.org/10.5281/zenodo.3629843
[18] A. Cockburn and J. Highsmith, "Agile software development, the people factor," Computer, vol. 34, no. 11, pp. 131–133, 2001.
[19] G. Azizyan, M. K. Magarian, and M. Kajko-Matsson, "Survey of agile tool usage and needs," in 2011 Agile Conference, Aug. 2011, pp. 29–38.
[20] S. C. Misra, V. Kumar, and U. Kumar, "Identifying some important success factors in adopting agile software development practices," Journal of Systems and Software, vol. 82, no. 11, pp. 1869–1890, 2009, SI: TAIC PART 2007 and MUTATION 2007. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S016412120900123X
[21] D. Larson and V. Chang, "A review and future direction of agile, business intelligence, analytics and data science," International Journal of Information Management, vol. 36, no. 5, pp. 700–710, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S026840121630233X
[22] S. W. Ambler and M. Lines, "The disciplined agile process decision framework," in Software Quality. The Future of Systems- and Software Development, D. Winkler, S. Biffl, and J. Bergsmann, Eds. Cham: Springer International Publishing, 2016, pp. 3–14.
[23] K. Schneider, O. Liskin, H. Paulsen, and S. Kauffeld, "Media, mood, and meetings: Related to project success?" ACM Transactions on Computing Education (TOCE), vol. 15, no. 4, p. 21, 2015.
[24] K. Kuusinen, T. Mikkonen, and S. Pakarinen, "Agile user experience development in a large software organization: Good expertise but limited impact," Oct. 2012, pp. 94–111.
[25] F. J. Fowler Jr., Survey Research Methods. Sage Publications, 2013.
[26] J. Lazar, J. H. Feng, and H. Hochheiser, Research Methods in Human-Computer Interaction. Morgan Kaufmann, 2017.
[27] C. Stettina and W. Heijstek, "Five agile factors: Helping self-management to self-reflect," vol. 172, Jun. 2011, pp. 84–96.
[28] F. Kortum, J. Klünder, W. Brunotte, and K. Schneider, "Sprint performance forecasts in agile software development - the effect of futurespectives on team-driven dynamics," in Proceedings of the 31st International Conference on Software Engineering and Knowledge Engineering. KSI Research Inc. and Knowledge Systems Institute Graduate School, Jul. 2019.
[29] R. Whittemore, S. K. Chase, and C. L. Mandle, "Validity in qualitative research," Qualitative Health Research, vol. 11, no. 4, pp. 522–537, 2001.
[30] B. A. Kitchenham and S. L. Pfleeger, Personal Opinion Surveys. London: Springer London, 2008, pp. 63–92. [Online]. Available: https://doi.org/10.1007/978-1-84800-044-5_3