Book

Extreme Programming Explained: Embrace Change

Authors: Kent Beck, Cynthia Andres
... Process (Kruchten, 2003) and the strictly phased waterfall method (Royce, 1970) to highly adaptive agile methods like eXtreme Programming (Beck and Andres, 2004), SCRUM (Schwaber and Beedle, 2001), or Crystal (Cockburn, 2004). Agile methods demand continuous user feedback during short design-implement-test-deliver iterations. ...
... As with scientific papers, source code quality is improved by reviews. The most intensive form of code review is pair programming, where one developer continually monitors another developer who is entering the code (Beck and Andres, 2004). A large number of defects and small-scale design flaws are intercepted this way before they become part of the standing code base. ...
... At the same time, we gathered feedback by creating new story cards, short formulations of a user requirement (Beck and Andres, 2004), to be implemented in a subsequent iteration (Figure 2-6). A story card was implemented vertically through all architectural layers, in contrast to the more classical approach of developing layer by layer horizontally. ...
... User stories have their origins in extreme programming (XP). Kent Beck, the founder of XP, stated that user stories were created to address the specific needs of software development conducted by small teams in the face of changing and vague requirements [1]. ...
... In general, user stories consist of brief descriptions of a system feature written from the perspective of the customer who wants the system [2]. Different user story templates and practices have been proposed by Beck [1], Jeffries et al. [3], Beck and Fowler [4], and Cohn [2,5]. ...
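The snippets above converge on the widely used "As a <role>, I want <goal>, so that <benefit>" template associated with Cohn [2,5]. As a purely illustrative sketch (the class, helper, and example story below are hypothetical and not taken from any of the cited works), a user story in this template can be represented and rendered as follows:

```python
# Minimal sketch of the "As a <role>, I want <goal>, so that <benefit>" template.
# The data class and the example story are illustrative only.
from dataclasses import dataclass

@dataclass
class UserStory:
    role: str     # who wants the feature
    goal: str     # what they want to accomplish
    benefit: str  # why it is valuable to them

    def render(self) -> str:
        return f"As a {self.role}, I want {self.goal}, so that {self.benefit}."

story = UserStory("hotline supporter",
                  "to see a caller's previous tickets",
                  "I can reuse earlier solutions")
print(story.render())
# As a hotline supporter, I want to see a caller's previous tickets,
# so that I can reuse earlier solutions.
```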
... The authors had an agreement on how to assess completeness, correctness, verifiability, and traceability. We used this rating for completeness: 1 if the issue was well covered, ½ if it was mentioned or had a partial solution, and 0 otherwise. Issue A11 (Table A1, in Appendix A) provides an example: "There are 10-15 employees who occasionally or full-time serve as supporters.
Article
Full-text available
(1) Background: User stories are widely used in Agile development as requirements. However, few studies have assessed the quality of user stories in practice. (2) Methods: What is the quality of user stories in practice? To answer the research question, we conducted a case study. We used an analysis report of a real-life project where an organization wanted to improve its existing hotline system or acquire a new one. We invited IT practitioners to write requirements for the new system based on the analysis report, user stories, and whatever else they considered necessary. The practitioners could ask the authors questions as they would ask a customer in a real setting. We evaluated the practitioners’ replies using these IEEE 830 quality criteria: completeness, correctness, verifiability, and traceability. (3) Results: The replies covered 33% of the needs and wishes in the report. Further, the replies largely missed other requirements needed in most projects, such as learnability and maintainability. Incorrect or restrictive solutions were often proposed by the practitioners. Most replies included user stories that were hard to verify, or would have caused a cumbersome user interface if implemented independently. (4) Conclusion: Implications for practitioners and researchers in the field are discussed in the study.
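To make the rating scheme from the excerpt above concrete, the sketch below aggregates per-issue completeness ratings (1 for well covered, ½ for mentioned or partially solved, 0 otherwise) into a single coverage figure. Treating coverage as the mean of the ratings, and the issue IDs and values used here, are our own illustrative assumptions rather than the exact calculation reported in the study:

```python
# Hypothetical aggregation of the completeness ratings described above:
# 1.0 = well covered, 0.5 = mentioned or partial solution, 0.0 = not covered.
ratings = {"A11": 0.5, "A12": 0.0, "A13": 1.0, "A14": 0.0}  # invented example data

coverage = sum(ratings.values()) / len(ratings)
print(f"Completeness coverage: {coverage:.0%}")  # -> Completeness coverage: 38%
```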
... This software development paradigm has spread widely in both small-to-medium and large organizations (Hoda, Salleh, and Grundy 2018), and nowadays it co-exists with traditional rigor-oriented development paradigms (Ahimbisibwe, Cavana, and Daellenbach 2015; Batra et al. 2010; Boehm and Turner 2005). Several agile software development methodologies have been proposed (Abrahamsson, Oza, and Siponen 2010), such as Scrum (Schwaber and Sutherland 2017), Extreme Programming, best known as XP (Beck 1999; Beck and Andres 2004), and Crystal (Cockburn 2004). Two relevant studies (Qumer and Henderson-Sellers 2008a, 2008b) evaluated the adherence to the Agile Manifesto components (i.e. ...
... Regarding specific agile design tenets from the Agile Manifesto (Highsmith and Cockburn 2001) and the Scrum (Schwaber and Sutherland 2017) and XP (Beck 1999; Beck and Andres 2004) methodologies, no relevant studies have been located. Thus, this core literature was analyzed to identify the agile tenets addressing the design issues. ...
... These three studies were considered an essential part of the components of design. We also added to the set of components of design the agile attributes from the studies of Conboy and Fitzgerald (2004), Qumer and Henderson-Sellers (2008a, 2008b), and Conboy (2009), and the agile design principles from the agile Software Engineering literature (Beck 1999; Beck and Andres 2004; Highsmith and Cockburn 2001; Schwaber and Sutherland 2017). Table 5 presents the Integrative Agile ITSM Framework of Tenets and Practices, resulting from a summarized adaptation of the tenets (aim, values, attributes, and principles) and the practices of the initial proposals for the agile ITSM studies from Verlaine (2017) and Mora et al. (2019, 2021), with the addition of the agile attributes from the studies of Conboy and Fitzgerald (2004), Qumer and Henderson-Sellers (2008a, 2008b), and Conboy (2009), and the agile design principles from the agile Software Engineering literature (Beck 1999; Beck and Andres 2004; Highsmith and Cockburn 2001; Schwaber and Sutherland 2017). ...
... User Story as a Driver - In order to achieve better-written requirements, we will focus attention on a driver that advances this quality value, the User Story (US) method [14]. User stories originate in the Extreme Programming (XP) methodology [15]. The US became the most common method for handling Agile projects' requirements, and its use has been adopted in many books about agile development [16] [17]. ...
Article
Full-text available
The term "software quality" is widely used, and although it has many definitions, no one definition is universally accepted. Often, the definition refers to specific phases of the software development process and not to software as a whole. In our article, in order to improve the quality of the software, we decided to improve the quality of the phases. To do that, we will focus attention on the drivers that advance quality value and use a new concept called QVD - quality value drivers. We focused on the requirements phase of software development, which probably ranks as the crucial first step. Hence, as a QVD we present the idea of "User Story" (US), a short and simple description of a functionality valuable to a user of a system. In the study, a comparison is made between requirements written by the US method and requirements that are not. After analyzing the results, we concluded that requirements written by the US method have been better understood and evaluated as less difficult to develop. The overall quality rating of their writing is higher than previously. In addition, learning the US method gives the person the ability to better assess the quality of requirements. Finally, improving the writing of the requirements using the US method as QVD improves the quality of the development process, eventually improving the software quality.
... In addition, the proposed aspects show the relationship between agile values and principles. On the other hand, the aspects proposed in AgilityRef arise from the identification of the relationship between agile principles and elements of agile approaches such as Scrum [35], XP [36], and Kanban [37], which are widely used by the software industry according to the State of Agile report [38]. ...
Article
Full-text available
Currently, there is a broad range of agile software development approaches mainly based on the values and principles defined in the agile manifesto (AM); however, in many cases, their implementation is carried out informally and without being aligned with the stated values and principles. In practice, practitioners and consultants may lose sight of the AM recommendations, which could jeopardize companies' agility; therefore, applying an agile approach does not make a company agile. In this article, we present a reference model called AgilityRef, which allows practitioners to support the understanding and implementation of the values and principles of the agile manifesto in a company's software development processes, through twenty-two aspects defined from relations established between the principles and values described in the AM and the process elements described in Scrum, XP, and Kanban. The evaluation of AgilityRef was carried out through a focus group in which its completeness, understandability, and suitability were evaluated. Our findings suggest good enough acceptance by the professionals and consultants who evaluated the proposal. The proposed reference model seems to allow professionals and companies to improve the understanding and implementation in practice of the concept of agility in their software development processes, thus minimizing the subjectivity and error of process adoption, implementation, evaluation, and articulation with the principles and values of the agile manifesto.
... Bellucci et al. (2015) examined in a field study how users interact with prototypes. They combined methods from Extreme Programming (Beck and Andres, 2004) with co-design sessions to develop the product with strong user involvement. The case study by Kautz (2010) examines how stakeholders are involved in practice within agile product development using participatory design. ...
Conference Paper
Full-text available
The integration of Human-Centered Design into agile product development can be challenging. In particular, the application of established user research and UX design methods within short feedback cycles frequently leads to discussions. This article provides an overview of the development and current state of research in Agile UX. In particular, we analyse Lean UX and answer the following questions: How can user research be implemented in agile product development and which best practices can be used to achieve it? For this purpose, we conducted a literature review and analysed how user research in agile product development has progressed in recent years. We discuss an approach that leverages identified best practices for Agile UX by introducing a new model for integrating Lean UX with Scrum to address the needs of the users more strongly in agile product development. We conclude that existing best practices and patterns already aim to adapt established user research methods to the agile framework. Lean UX provides a suitable approach to integrate such user research methods for use in agile product development. This approach has the potential to improve the user experience.
... The requirements change as business needs change; time-to-market, budget, quality, or maintainability concerns increase the level of challenge faced by software companies. To cope with these challenges, companies turn to modern software development processes such as agile development [2], [16]. These processes are known to focus on managing time-to-market constraints and on the ability to accommodate changes during the software development life cycle [3]. ...
Preprint
Full-text available
One factor of success in software development companies is their ability to deliver good quality products, fast. For this, they need to improve their software development practices. We work with a medium-sized company modernizing its development practices. The company introduced several practices recommended in agile development. While the benefits of these practices are well documented, the impact of such changes on the developers is less well known. We follow this modernization before and during the COVID-19 outbreak. This paper presents an empirical study of the perceived benefits and drawbacks of these practices, as well as the impact of COVID-19 on the company's employees. One of the conclusions is the additional difficulty created by obsolete technologies when adapting both the technology itself and the development practices it encourages to modern standards.
... The requirements change as business needs change; time-to-market, budget, quality, or maintainability concerns increase the level of challenge faced by software companies. To cope with these challenges, companies turn to modern software development processes such as agile development [4,22]. These processes are known to focus on managing time-to-market constraints and on the ability to accommodate changes during the software development life cycle [5]. ...
... As the most popular Agile method, Scrum focuses solely on project management. Extreme Programming (XP), on the other hand, puts its emphasis on software development [8]. Other Agile methods such as Feature Driven Development (FDD), Test Driven Development (TDD), Crystal Family, Agile Modeling, etc. have particular practices to solve specific challenges in software projects [7]. ...
... Traditional models like the waterfall, Rational Unified Process (RUP), spiral, and incremental models do not satisfy current software functionality requirements, which is why the popularity of Agile methods has increased [3,4,6]. It is observed that Carnegie Mellon University's Software Engineering Institute (SEI) has developed a series of software technologies with architecture-centric methods that are used for architecture design and analysis [9]. Attempts have been made to fit such methods into the software development processes that are in demand. ...
Article
Full-text available
There have been a number of software development processes. Based on their advantages and disadvantages, they are accepted and utilized. But none of the development processes can be claimed to be suitable for all projects, as the development contexts and scenarios differ. Therefore, there is a need to model the software development process based on the ongoing situation. In this paper, we define and develop a software development process that uses microservice architecture and DevOps culture for Extreme Programming (XP). The proposed software development process can be used for large, complex, and geographically distributed software systems. A literature review was conducted to understand the currently used software development processes and architectures in extreme programming. Nevertheless, in practice, the usage of extreme programming in large-scale companies/systems is not known. This study aims to evaluate the impact of using the XP process on the development of large-scale distributed systems, taking online shopping services as a case study. A case study was conducted in an organization that develops large and complex software, by modifying extreme programming. Questionnaires were prepared and a qualitative analysis was carried out. The case study aided in learning about the effectiveness of combining XP with DevOps. Further, the outcomes of the Crystal Clear methodology, which depends on people rather than processes, and Extreme Programming were compared. As a result of this process, XP can handle large, complex, and geographically distributed software systems. The developed software becomes much faster, cost-effective, loosely coupled, and deployable across the globe. It is observed that the practices of Extreme Programming, when adapted in the project, give rise to outputs like the people factor and also help create ideas and solutions for complex design issues. This is the approach for the conceptualization and implementation of overall systems.
... The advantages include location independence and the use of one's own familiar development environment [37]. Furthermore, advantages that result from the initial idea of pair programming still apply, e.g., developers can share knowledge and collaboratively decide how to develop [38]. ...
Article
Full-text available
Context: In software visualization research, various approaches strive to create immersive environments by employing extended reality devices. In that context, only little research has been conducted on the effect of collaborative, i.e., multi-user, extended reality environments. Objective: We present our journey toward a web-based approach to enable (location-independent) collaborative program comprehension using desktop, virtual reality, and mobile augmented reality devices. Method: We designed and implemented three multi-user modes in our web-based live trace visualization tool ExplorViz. Users can employ desktop, mobile, and virtual reality devices to collaboratively explore software visualizations. We conducted two preliminary user studies in which subjects evaluated our VR and AR modes after solving common program comprehension tasks. Results: The VR and AR environments can be suitable for collaborative work in the context of program comprehension. The analyzed feedback revealed problems regarding usability, e.g., readability of visualized entities and performance issues. Nonetheless, our approach can be seen as a blueprint for other researchers to replicate or build upon these modes and results. Conclusions: ExplorViz’s multi-user modes are our approach to enable heterogeneous collaborative software visualizations. The preliminary results indicate the need for more research regarding effectiveness, usability, and acceptance. Unlike related work, we approach the latter by introducing a multi-user augmented reality environment for software visualizations based on off-the-shelf mobile devices.
... Such automation usually happens in the context of the Continuous Integration (CI) practice [4]. With the perceived agility brought by the use of CI, automated deployment appears to be an additional step. ...
Preprint
Full-text available
Context: As the adoption of continuous delivery practices increases in software organizations, different scenarios struggle to make it scales for their products in long-term evolution. This study looks at the concrete software architecture as a relevant factor for successfully achieving continuous delivery goals. Objective: This study aims to understand how the design of software architectures impacts the continuous deployment of their software product. Method: We conducted a systematic literature review to identify proper evidence regarding the research objective. We analyzed the selected sources adopting a synthesis and analysis approach based on Grounded Theory. Results: We selected 14 primary sources. Through our analysis process, we developed a theory that explains the phenomenon of Architecting for Continuous Deployment. The theory describes three other phenomena that support Architecting for Continuous Deployment: Supporting Operations, Continuous Evolution, and Improving Deployability. Furthermore, the theory comprises the following elements: contexts, actions and interactions, quality attributes, principles, and effects. We instantiated these elements and identified their interrelationships. The theory is supported by providing bi-directional traceability from the selected sources to the elements and vice-versa. Conclusions: Developing adequate architecture plays a crucial role in enabling continuous delivery. Supporting operations becomes vital to increase the deployability and monitorability of software architecture. These two outcomes require that developers accept responsibility for maintaining the operations. The continuous evolution of the architecture is essential, but it must consider balanced management of technical debt. Finally, improving deployability requires attention to the test strategy and how it affects downtime to enable efficient pipelines.
... The solution consists of agile-based software [7], where the delivered designs are more accurate and closer to the use-case scenarios, the requirements are formally specified, and the product backlog is created to display a step-by-step view of the processes involved in the creation of the software tool. All these processes and developments are determined by the priority set during customer feedback. ...
Article
Full-text available
This article presents the design, development, and implementation of a software tool that addresses the problems involving the management, control, and reporting of processes within the institutional plan for environmental management (known as plan institucional de gestión ambiental (PIGA) by its Spanish acronym) for the Universidad Distrital Francisco José de Caldas. The software focuses on bringing such processes into an automated setting, based on the extreme programming (XP) agile methodology, which mainly centers on the continuous development of the customer requirements to offer a more assertive tool in line with the PIGA processes. The result is complete user satisfaction and highly usable, adaptable, and efficient software that inherently optimizes and automates the environmental management processes of the PIGA program. This work delivers an applet that meets the design and implementation requirements of environmental management policies. The proposed tool manages to reduce process-related times by 97%, therefore allowing efforts to be aimed at other missional functions and increasing the overall value offer of the organization.
... Co-design is useful when the outcome should be as close to the eventual user's reality as possible. We combine co-design with design science research (DSR) (Hevner and Chatterjee 2010) and agile methodologies (Beck and Andres 2004; Sutherland and Schwaber 2013). Both co-design and DSR regard background theories as a vital element. ...
Article
Full-text available
Designing digital interaction for people facing the end of life in early or middle adulthood is a challenging task. The user, who may be a person of similar age, culture, and social status as the designers, is nevertheless living in a reality nothing short of alien to them. For the designer, approaching the users and considering their circumstances and their reality is extremely stressful. A theoretical framework is built to help the designers. Two psychological theories that address the end of life have been fused together through the Grounded Theory paradigm. The first theory is Erikson's Stages of Psychosocial Development, focusing primarily on the ninth stage. The second theory is Kübler-Ross's Five Stages of Grief, taken in her original, non-sequential manner, describing a person's grief over their own demise (preparatory grief) rather than more general grief. Co-design, Agile, and Design Science Research are brought together with this theoretical framework to assist the user to face their own death and to realistically appreciate that reality, which gives the designers solid ground on which to stand when facing this ultimate application area. The outcome is a framework of 13 categories of human desires at the end of life, accompanied by conceptual ideas of how to meet these desires with digital solutions.
... One student would act as the "Driver" while the other would take the "Navigator" role. Studies suggest that pair programming was effective in both the normal in-person setting and the remote setting [2], [3]. When applied correctly, pair programming is also beneficial in both industry [4] and education settings [5]-[9]. ...
Article
Full-text available
Pair programming is an old yet effective method for solving programming problems in the computer science field. Here, at Institut Teknologi Del, we employed this method in the early programming courses to help the students master the learning objectives and practice them by solving problems in a collaborative manner. The COVID-19 pandemic started in early 2020 and threatened almost every sector of our lives, including education. Students were sent home, and the learning process was forced to switch from in-person to distance learning. This situation was a true challenge for our traditional pair programming approach, in which the students used to work together side by side. We then adapted the approach to fit the distributed setting. In this paper, we share our experience in applying distributed pair programming, the problems we faced, and the strategies we employed to achieve the highest learning impacts.
... As digitalisation and digital transformation have become a common, often ubiquitous, part of our society, we argue the need for new approaches to project management that adapt to a situation of rapid technological development and flexibility. This flexibility is evident in agile project methods, such as eXtreme Programming (Beck and Andres 2005) and Scrum (Schwaber and Beedle 2002), although project management methods developed during the 1960s are still in use. In 1969, the U.S. Project Management Institute published A Guide to the Project Management Body of Knowledge (PMBOK Guide), which has served as the standard for project management since that time (Tonnquist 2018). ...
Book
Book Description With the widespread transformation of information into digital form throughout society – firms and organisations are embracing this development to adopt multiple types of IT to increase internal efficiency and to achieve external visibility and effectiveness – we have now reached a position where there is data in abundance and the challenge is to manage and make use of it fully. This book addresses this new managerial situation, the post-digitalisation era, and offers novel perspectives on managing the digital landscape. The topics span how the post-digitalisation era has the potential to renew organisations, markets and society. The chapters of the book are structured in three topical sections but can also be read individually. The chapters are structured to offer insights into the developments that take place at the intersection of the management, information systems and computer science disciplines. It features more than 70 researchers and managers as collaborating authors in 23 thought-provoking chapters. Written for scholars, researchers, students and managers from the management, information systems and computer science disciplines, the book presents a comprehensive and thought-provoking contribution on the challenges of managing organisations and engaging in global markets when tools, systems and data are abundant. ---------------------------------------------------------------------------------------- Table of Contents Preface Peter Ekman, Peter Dahlin and Christina Keller Foreword Fredrik Nilsson and Fredrik Tell 1. Perspectives on Management and Information Technology after Digital Transformation Peter Ekman, Peter Dahlin and Christina Keller 2. Digital Transformation: Towards a New Perspective for Large Established Organisations in a Digital Age Alan W. Brown Part 1 – The transformation of society and markets 3. Managing Digital Servitization: A Service Ecosystem Perspective David Sörhammar, Bård Tronvoll and Christian Kowalkowski 4. Caught on the platform or jumping onto the digital train: Challenges for industries lagging behind in digitalisation Peter Ekman, Magnus Berglind and Steven Thompson 5. Digitalisation for Sustainability: Conceptualisation, Implications and Future Research Directions Elena Anastasiadou, Linda Alkire and Jimmie Röndell 6. Reaching New Heights in the Cloud: The Digital Transformation of the Video Games Industry Kevin Walther and David Sörhammar 7. Hyper-Taylorism and Third-order Technologies: Making Sense of the Transformation of Work and Management in a Post-digital Era Christoffer Andersson, Lucia Crevani, Anette Hallin, Caroline Ingvarsson, Chris Ivory, Inti José Lammi, Eva Lindell, Irina Popova and Anna Uhlin 8. Why Space is Not Enough: Service innovation and service delivery in senior housing Petter Ahlström, Göran Lindahl, Markus Fellesson, Börje Bjelke and Fredrik Nilsson 9. Challenges in Implementing Digital Assistive Technology in Municipal Healthcare Ann Svensson, Linda Bergkvist, Charlotte Bäccman and Susanne Durst Part 2 – Managerial and organisational challenges 10. Modern project management: Challenges for the future Klas Sundberg, Birger Rapp and Christina Keller 11. Managing the Paradoxes of Digital Product Innovation Fredrik Svahn and Bendik Bygstad 12. When External Reporting Goes Social: New Conditions for Transparency and Accountability? Cecilia Gullberg 13. Robotic Process Automation and the Accounting Profession’s Extinction Prophecy Matthias Holmstedt, Fredrik Jeanson and Angelina Sundström 14. 
Managing Digital Employee-Driven Innovation: The Role of Middle-Level Managers and Ambidextrous Leadership Izabelle Bäckström and Peter Magnusson 15. Digital Gamification of Organisational Functions and Emergent Management Practices Edward Gillmore 16. Leveraging Digital Technologies in Enterprise Risk Management Jason Crawford and Jan Lindvall Section 3 – Framing digitalisation 17. The End of Business Intelligence and Business Analytics Matthias Holmstedt and Peter Dahlin 18. ‘Deleted User’: Signalling Digital Disenchantment in the Post-Digital Society Cristina Ghita, Claes Thorén and Martin Stojanov 19. The Role of Boundary-Spanners in the Post-Digitalised Multinational Corporation Henrik Dellestrand, Olof Lindahl and Jakob Westergren 20. The Effect of Digital Transformation on Subsidiary Influence in the Multinational Enterprise Noushan Memar, Ulf Andersson, Peter Dahlin and Peter Ekman 21. Understanding Information System Outsourcing in the Digital Transformation Era: The Business-relationship Triad View Cecilia Erixon and Peter Thilenius 22. Transforming the Management/Profession Divide: The Use of the Red–Green Matrix in Swedish Schools Anton Borell, Johan Klaassen, Roland Almqvist and Jan Löwstedt 23. Integrating research in master’s programmes: Developing students’ skills to embrace digitally transformed markets Todd Drennan, Cecilia Thilenius Lindh and Emilia Rovira Nordman Index ---------------------------------------------------------------------------------------- Editor(s) Biography Peter Ekman is an associate professor of marketing at Mälardalen University and deputy dean of the Swedish Research School of Management and IT hosted by Uppsala University. His research focuses on firm digitalisation within business networks and service ecosystems, often in a global sustainability or globalisation context. Peter Dahlin is an associate professor of business at Mälardalen University, Sweden, and honorary visiting scholar at the University of Exeter, UK, and is affiliated to the Business School. His research interests include applied analytics, network analysis and business performance. Christina Keller is the dean of the Swedish Research School of Management and IT at Uppsala University and professor in informatics at Lund University School of Economics and Management. Her main research interests include online learning, design science research and information systems in healthcare.
... The development process can be broadly described as agile (Beck and Andres 2004; James and Shane 2008), in which short-term goals are defined and implemented, and subsequently further refined based on new feedback from users; agile teams provide for short release cycles and continuous improvement to the software (Fig. 1b). Progress is tracked using a version control system with built-in issue tracking software (GitHub, https://github.com/). ...
Article
Full-text available
Modern breeding methods integrate next-generation sequencing (NGS) and phenomics to identify plants with the best characteristics and greatest genetic merit for use as parents in subsequent breeding cycles to ultimately create improved cultivars able to sustain high adoption rates by farmers. This data-driven approach hinges on strong foundations in data management, quality control, and analytics. Of crucial importance is a central database able to 1) track breeding materials, 2) store experimental evaluations, 3) record phenotypic measurements using consistent ontologies, 4) store genotypic information, and 5) implement algorithms for analysis, prediction and selection decisions. Because of the complexity of the breeding process, breeding databases also tend to be complex, difficult, and expensive to implement and maintain. Here, we present a breeding database system, Breedbase (https://breedbase.org/). Originally initiated as Cassavabase (https://cassavabase.org/) with the NextGen Cassava project (https://www.nextgencassava.org/), and later developed into a crop-agnostic system, it is presently used by dozens of different crops and projects. The system is web-based and is available as open source software. It is available on GitHub (https://github.com/solgenomics/) and packaged in a Docker image for deployment (https://dockerhub.com/breedbase/). The Breedbase system enables breeding programs to better manage and leverage their data for decision making within a fully integrated digital ecosystem. Availability https://github.com/solgenomics, https://hub.docker.com/r/breedbase/breedbase.
... "Strategic Project Leadership (SPL)" is an integrated approach to inspire project managers to be business leaders of their projects addressing the projects uncertainty in a dynamic and flexible way engaging and empowering people and teams to instill energy, gain commitment and to influence innovative and successful project outcomes (Shenhar, 2015). (Beck & Andres, 2004), Dynamic systems development method -DSDM (Voigt, 2004), Feature Driven Development -FDD (Fridaus, 2014), Crystal clear, Crystal orange, and Crystal orange web methodologies, which rely on incremental development cycles, wide communication flow, and good collaboration (Cockburn, 2002), the project management diamond approach to successful growth and innovation (Shenhar & Dvir, 2007) and ...
Projects start with an idea, which is deployed into a set of actions to deliver expected or foreseen outcomes, and they should be carried out in a cohesive and unified context. In this paper, contributions are devised to integrate project management and technology transfer practices to drive the transformation of knowledge into innovation within the scope of research and development projects at universities and research laboratories or centers. These contributions are presented in the form of a conceptual model that includes three interrelated groups. The first group is focused on the definition and selection of project ideas, the second group on project management, and the third group on knowledge and technology transfer; these groups are supported by improvement, innovation, and project management guidelines and standards. To obtain insights to define the model, quantitative and qualitative methods were used, and the data was processed using content analysis and descriptive and inferential techniques meant to ensure the complementarity of the data and obtain a holistic understanding of the practices under analysis. The approach was exploratory and descriptive, but also analytical in the sense that it states the issues at stake to support project management and technology transfer.
... According to [3], the agile approaches that stand out the most in the software industry today are: Scrum [4], Extreme Programming (XP) [5], Crystal Clear [6], Lean Software Development (LSD) [7], Adaptive Software Development (ASD) [8], Dynamic Systems Development Method (DSDM) [9], Feature Driven Development (FDD) [10], Agile Unified Process (Agile UP) [11], Kanban [12], among others. On the other hand, among the most popular traditional approaches are: Rational Unified Process (RUP) [13], Microsoft Solutions Framework (MSF) [14], Capability Maturity Model Integration (CMMI) [15], Model-Based Architecture and Software Engineering (MBASE) [16], among others. ...
Article
Full-text available
DevOps has emerged as an approach to help organizations automate processes, optimize costs, increase profitability, improve the stability of the software development process and the responsiveness of organizations, and create a more agile development and release pipeline. However, its adoption, maintenance, and evaluation continue to be a challenge for software organizations, due to the absence of solutions that formalize process elements in a detailed way, such as practices, roles, artifacts, and objectives, among others. This paper presents a DevOps Model to support the adoption of DevOps, which provides a set of fundamental and complementary values, principles, dimensions, and practices. The practices suggest a set of items such as purpose, specific objectives, and expected artifacts. The elements defined in the proposed DevOps Model arise from the elements found in the studies analyzed through a systematic mapping study. The model evaluation was carried out with a software development company as a case study. The results obtained have allowed the case study company to evaluate, diagnose, and identify improvement opportunities to be carried out in the processes and projects where a DevOps-based approach is used, in a practical, useful, and adequate way that suits this type of company and requires few resources, in terms of both economic investment and time. This is how the DevOps Model could guide professionals and organizations towards a better understanding of DevOps, in addition to minimizing the subjectivity and error of its interpretation, adoption, and evaluation.
... This is based on both coding and testing, while integrating customers in the process. It encourages simplicity and feedback (from the system, clients, and team), and embraces change (Beck and Andres, 2004). XP presents a process with fast, small releases and continuous integration (teams are always synced), using feedback techniques such as pair programming, where two programmers work on the same code simultaneously to reduce errors and increase speed (Zannier et al., 2004). ...
Thesis
This thesis explores new systems engineering design needs for evolutive system architectures (eSAR), which are a subset of a new generation of complex hardware-based systems, within a context defined by global design stressors such as resource scarcity and complexity. These evolutive systems are highly adaptable, aiming towards resource regeneration, and present a highly intelligent baseline. Based upon an extensive literature review highlighting key gaps in state-of-the-art design engineering and system engineering techniques, a full-cycle evolutive development methodology (eSARD) is presented, inspired by natural evolution mechanisms while addressing heritage and better system performance. The holistic eSARD method tackles the design, implementation, system operations, and overall system optimization of an eSAR.
... Contributions to the agility and reactiveness of product development are known from the Agile (Beck et al. 2001), Lean (Gautam and Singh 2008; Ries 2011), and User-centered Design (Norman 1986; Gothelf 2013) methodologies. Agile practices such as short development cycles, collaborative decision-making, rapid feedback loops, and continuous integration enable software organizations to deal with certain levels of uncertainty and to address change effectively (Highsmith and Cockburn 2001; Beck and Andres 2004). In startup contexts, Giardino et al. showed that agile practices are adopted, but in an ad-hoc manner. ...
Article
Full-text available
Context: Software startups are an essential source of innovation and software-intensive products. The need to understand product development in startups and to provide relevant support is highlighted in software research. While state-of-the-art literature reveals how startups develop their software, the reasons why they adopt these activities are underexplored. Objective: This study investigates the tactics behind software engineering (SE) activities by analyzing key engineering events during startup journeys. We explore how entrepreneurial mindsets may be associated with SE knowledge areas and with each startup case. Method: Our theoretical foundation is based on causation and effectuation models. We conducted semi-structured interviews with 40 software startups. We used two-round open coding and thematic analysis to describe and identify entrepreneurial software development patterns. Additionally, we calculated an effectuation index for each startup case. Results: We identified 621 events merged into 32 codes of entrepreneurial logic in SE from the sample. We found a systemic occurrence of the logic in all areas of SE activities. Minimum Viable Product (MVP), Technical Debt (TD), and Customer Involvement (CI) tend to be associated with effectual logic, while testing activities at different levels are associated with causal logic. The effectuation index revealed that startups are either effectuation-driven or mixed-logics-driven. Conclusions: Software startups fall into two types that differ in how traditional SE approaches may apply to them. Effectuation seems the most relevant and essential model for explaining and developing suitable SE practices for software startups.
... There are five values: communication, simplicity, feedback, courage, and respect. These values underpin some basic principles: rapid feedback, assumed simplicity, incremental change, embracing change, and quality work (BECK, 2004) ...
Article
Agile development has been a topic of growing interest in the software development community in recent years. When comparing agile methodologies, such as eXtreme Programming (XP), with prescriptive ones, many authors list the Rational Unified Process (RUP) among the latter. The currently available works portray the 2003 version of RUP, with the approach to Best Practices that is already considered "classic". This article shows that RUP is a process in constant evolution and, in version 7.0, has incorporated into its "Key Concepts" most of the values advocated by the Agile Manifesto before the software development community.
... These definitions focus on people and how they communicate to reach a common goal or objective. Research evidence suggests that in software organizations, informal communication is essential for understanding and communicating about stakeholder values and needs [4]. Informal communication is interactive and includes unplanned interactions that occur in the midst of daily activities [46] [13]. ...
Preprint
Building a shared understanding of non-functional requirements (NFRs) is a known but understudied challenge in requirements engineering, especially in organizations that adopt continuous software engineering (CSE) practices. During the peak of the COVID-19 pandemic, many CSE organizations complied with working remotely due to the imposed health restrictions; some continued to work remotely while implementing business processes to facilitate team communication and productivity. In remote CSE organizations, managing NFRs becomes more challenging due to the limitations to team communication coupled with the incentive to deliver products quickly. While previous research has identified the factors that lead to a lack of shared understanding of NFRs in CSE, we still have a significant gap in understanding how CSE organizations, particularly in remote work, build a shared understanding of NFRs in their software development. We conduct a three-month ethnography-informed case study of a remote CSE organization. Through thematic analysis of our qualitative data from interviews and observations, we identify a number of practices in developing a shared understanding of NFRs. The collaborative workspace the organization uses for remote interaction is Gather, which simulates physical workspaces, and which our findings suggest allows for informal communications instrumental for building shared understanding. As actionable insights, we discuss our findings in light of proactive practices that represent opportunities for software organizations to invest in building a shared understanding of NFRs in their development.
... Continuous integration (CI) is a development practice that has its roots in extreme programming, an agile software development process [106]. CI focuses on frequent builds of the software under development. ...
Thesis
A variety of products undergo a transformation from a purely mechanical design to more and more software and electronic components. A stark example is the watch. Several decades ago, watches were purely mechanical. Modern smart watches are almost completely electronic devices which heavily rely on software. Further, a smart watch offers many more features than just information about the current time. This change has had a crucial impact on how software is developed. A first attempt to control the rising complexity was to move to agile development practices such as extreme programming or Scrum. This rise in complexity affects not only the development process but also quality assurance and software testing. If a product contains more and more features, then more tests are necessary to ensure quality standards. Furthermore, agile development practices work in an iterative manner, which leads to repetitive testing and puts more effort on the testing team. Within this thesis we aimed to ease the pain of testing, and thereby examined a series of subproblems that arise. A key complexity is the number of test cases. We intended to reduce the number of test cases before they are executed manually or implemented as automated tests. Thereby we examined the test specification and, based on the requirements coverage of the individual tests, were able to identify redundant tests. We relied on a novel metaheuristic called GCAIS, which we improved upon iteratively. Another task is to control the remaining complexity. Testing is often time-critical, and an appropriate subset of the available tests must be chosen in order to get a quick insight into the status of the device under test. We examined this challenge in two different testing scenarios. The first scenario is located in semi-automated testing, where engineers execute a set of automated tests locally and closely observe the behaviour of the system under test. We extended GCAIS to compute test suites that satisfy different criteria if provided with sufficient search time. The second use case is located in fully automated testing in a continuous integration (CI) setting. CI focuses on frequent software build cycles which also include testing. These builds contain a testing stage which greatly emphasizes speed. Thus we also have to compute the crucial tests there. However, due to the nature of the process, we have to continuously recompute a test suite for each build, as the software and maybe even the test cases at hand have changed. Hence it is hard to compute the test suite ahead of time, and these tests have to be determined as part of the CI execution. Thus we switched to a computationally lightweight learning classifier system (LCS) to prioritize and select test cases. We integrated a series of innovations we made, such as continuous priorities, experience replay, and transfer learning, into an LCS known as XCSF. This enabled us to outperform a state-of-the-art artificial neural network which is used by companies such as Netflix. We further investigated how an LCS can be made faster using parallelism. We developed generic approaches which may run on any multicore computing device. This is of interest for our CI use case, as the build server's architecture is unknown. However, the methods are also independent of the concrete LCS and are not linked to our testing problem.
We identified that many of the challenges that need to be faced in the CI use case have been tackled by Organic Computing (OC), for example the need to adapt to an ever-changing environment. Hence we relied on OC design principles to create a system architecture which wraps the developed LCS and integrates it into existing CI processes. The final system is robust and highly autonomous. A side effect of the high degree of autonomy is a high level of automation, which fits CI well. We also give insight into the usability and delivery of the full system to our industrial partner. Test engineers can easily integrate it with a few lines of code and need no knowledge about LCS and OC in order to use it. Another implication of the developed system is that OC's ideas and design principles can also be employed outside the field of embedded systems. This shows that OC has a greater level of generality. The process of testing and correcting found errors is still only partially automated. We make a first step towards automating the entire process and thereby draw an analogy to OC's concept of self-healing. As a first proof of concept of this school of thought, we take a look at touch interfaces. There we can automatically manipulate the software to fulfill the specified behaviour. Thus only a minimal amount of manual work is required.
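The excerpt and abstract above describe two recurring ideas: removing tests that are redundant with respect to requirements coverage, and repeatedly selecting a small but crucial test suite for each CI build. As a generic, simplified illustration of the first idea only (a greedy set-cover heuristic, not the GCAIS metaheuristic or the XCSF-based prioritization used in the thesis; all test names and requirement IDs below are hypothetical), consider:

```python
# Greedy requirements-coverage-based test suite reduction (illustrative only;
# the cited thesis uses metaheuristics such as GCAIS and an LCS, not this heuristic).
def reduce_suite(tests: dict[str, set[str]]) -> list[str]:
    """Pick a small subset of tests that still covers every requirement."""
    uncovered = set().union(*tests.values())
    selected = []
    while uncovered:
        # choose the test covering the most still-uncovered requirements
        best = max(tests, key=lambda t: len(tests[t] & uncovered))
        selected.append(best)
        uncovered -= tests[best]
    return selected

# Hypothetical mapping from test case to the requirements it covers.
suite = {
    "test_login":    {"R1", "R2"},
    "test_logout":   {"R2"},          # redundant once test_login is chosen
    "test_checkout": {"R3", "R4"},
    "test_refund":   {"R4", "R5"},
}
print(reduce_suite(suite))  # ['test_login', 'test_checkout', 'test_refund']
```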
... Further, architectural changes drive extensive refactoring and result in subsequent merge conflicts. Despite the fact that this is an inherent feature of XP - "XP is a lightweight methodology for small-to-medium-sized teams developing software in the face of vague or rapidly changing requirements" [2] - it is nonetheless something we noted as a systematic cause of refactoring and merge conflicts. E.g.:
Preprint
Full-text available
Objective: The purpose of this paper is to identify the largest cognitive challenges faced by novices developing software in teams. Method: Using grounded theory, we conducted an ethnographic study for two months, following four ten-person novice teams, consisting of computer science students, developing software systems. Result: This paper identifies version control and merge operations as the largest challenge faced by the novices. The literature studies reveal that little research appears to have been carried out in the area of version control from a user perspective. Limitations: A qualitative study on students is not applicable in all contexts, but the result is credible, grounded in data, and substantiated by extant literature. Conclusion: We conclude that our findings motivate further research on cognitive perspectives to guide the improvement of software engineering and its tools.
... On the basis of these Agile values, several Agile methods and frameworks for Agile software development emerged, such as Scrum (Schwaber, 2004), Kanban (Anderson, 2010), or Extreme Programming (XP) (Beck and Andres, 2007). These partly served as a template for the Agile manifesto and were thus developed beforehand. ...
Thesis
Context. Agile methods are increasingly being used by companies to develop digital products and services faster and more effectively. Today's users not only demand products that are easy to use, but also products with a high User Experience (UX). Agile methods themselves do not directly support the development of products with a good user experience. In combination with UX activities, it is potentially possible to develop a good UX. Objective. The objective of this PhD thesis is to develop a UX Lifecycle to manage the user experience in the context of Agile methods. With this UX Lifecycle, Agile teams can manage the UX of their product in a targeted way. Method. We developed the UX Lifecycle step by step, according to the Design Science Research Methodology. First, we conducted a Structured Literature Review (SLR) to determine the state of the art of UX management. The result of the SLR concludes in a GAP analysis. On this basis, we derived requirements for UX management. These requirements were then implemented in the UX Lifecycle. In developing the UX Lifecycle, we developed additional methods (UX Poker, UEQ KPI, and IPA) to be used when deploying the UX Lifecycle. Each of these methods has been validated in studies, with a total of 497 respondents from three countries (Germany, England, and Spain). Finally, we validated the UX Lifecycle as a whole with a Delphi study, with a total of 24 international experts from four countries (Germany, Argentina, Spain, and Poland). Results. The iterative UX Lifecycle (Figure 1) consists of five steps: Initial Step 0 ‘Preparation’, Step 1 ‘UX Poker’ (before development/Estimated UX), Step 2 ‘Evaluate Prototype’ (during development/Probable UX), Step 3 ‘Evaluate Product Increment’ (after development/Implemented UX), and a subsequent Step 4 ‘UX Retrospective’. With its five steps, the UX Lifecycle provides the structure for continuously measuring and evaluating the UX in the various phases. This makes it possible to develop the UX in a targeted manner and to check it permanently. In addition, we have developed the UX Poker method. With this method, the User Experience can be determined by the Agile team in the early phases of development. The evaluation study of UX Poker has indicated that UX Poker can be used to estimate the UX for user stories. In addition, UX Poker inspires a discussion about UX that results in a common understanding of the UX of the product. To interpret the results from the evaluation of a prototype and product increment, we developed or derived the User Experience Questionnaire KPI and Importance-Performance Analysis. In a first study, we were able to successfully apply the two methods and, in combination with established UEQ methods, derive recommendations for action regarding the improvement of the UX. This would not have been possible without their use. The results of the Delphi study to validate the UX Lifecycle reached consensus after two rounds. The results of the evaluation and the comments lead to the conclusion that the UX Lifecycle has a sufficiently positive effect on UX management. Conclusion. The goal-oriented focus on UX factors and their improvement, as propagated in the UX Lifecycle, is a good way of implementing UX management in a goal-oriented manner. By comparing the results from UX Poker, the evaluation of the prototype, and the product increment, the Agile team can learn more about developing a better UX within a UX retrospective. The UX Lifecycle will have a positive effect on UX management.
The use of individual components of the UX Lifecycle, such as UX Poker or the Importance-Performance Analysis, already helps an Agile team to improve the user experience. But only in combination with the UX Lifecycle and the individual methods and approaches presented in this PhD thesis is targeted management of the user experience possible, in our view. This was the initial idea of this PhD thesis, which we are convinced we could implement.
... The proposed browser is also inspired in part by pair programming [11], in which two programmers work in the same workspace, one writing code, while the other watches and reviews the typed code. The two programmers typically work on the same project. ...
Chapter
Refactoring is an essential agile practice; microservices are a currently trending implementation approach for service-oriented architectures. While program-internal code refactoring is well established, refactoring components on the architectural level has been researched but not adopted widely in practice yet. Hence, refactoring service Application Programming Interfaces (APIs) is not understood well to date. As a consequence, practitioners struggle with the evolution of APIs exposed by microservices. To overcome this problem, we propose to switch the refactoring perspective from implementation to integration and study how refactorings can be applied to the problem domain of agile service API design and evolution. We start with an empirical analysis and assessment of the state of the art and the practice. The contributions of this paper then are: 1) presentation of results from a practitioner survey on API change and evolution, 2) definitions for a future practice of API refactoring and 3) a candidate catalog of such API refactorings. We discuss these contributions and propose a research action plan as well.
Chapter
The Internet of Things and Services (IOT/IOS) as well as the Industrial Internet and Industry 4.0 assume networked products, systems, and services in the future. The value proportion of electronics and software will continually increase with these kinds of products and embedded services. When products communicate with one another over the Internet, we refer to Cyberphysical Systems or Cybertronic Systems. The development of these new systems will bring several consequences: interdisciplinary, regionally and organizationally distributed, and integrated product development; a rethinking of current construction methods, processes, IT solutions, and organizational forms; as well as the demand for consistent process chains based on digital models in the requirement definition, system architecture, product development, simulation, product planning, production, and service. Furthermore, the planning and design procedures of all disciplines (mechanical, electronic, and software) must be put to the test and their suitability for a new process model for product, system, and service development checked, in order to transition them to a common, integrated, and interdisciplinary method, process, and IT solution approach. This approach to the digitalization of product development is called Engineering 4.0. The methodologies of systems engineering (SE), model-based systems engineering (MBSE), and systems thinking form the foundations. The digitalization of products and product development means a transformation process which rearranges the classic limits of a fragmented and competitive IT solution world, a departure from silo thinking to a consistent, integrative solution approach for engineering. A lightweight and federated engineering backbone (→ System Lifecycle Management, SysLM) will take on the role of data and process integration for the entire product lifecycle, including operations. This chapter presents the foundations, framework conditions, and drivers of digitalization and derives from these an adjusted design methodology for the development of cybertronic products and systems.
Book
Full-text available
It is with great satisfaction that we present this textbook of the first ERCEMAPI (Escola Regional de Computação – Ceará, Maranhão e Piauí), held in Fortaleza on the UNIFOR campus. The Regional Computing Schools (ERC), promoted by the SBC (Sociedade Brasileira de Computação) through its regional secretariats, are non-profit events whose main objectives are: 1. To disseminate scientific knowledge on emerging and relevant topics in computing. 2. To encourage the production of high-quality didactic and technical texts. The first ERCEMAPI is organized by UNIFOR, UFC, CEFET-CE, and FIC, sponsored by SECITECE, FUNCAP, FAPEMA, and Instituto Atlântico, and supported by UECE, UFMA, and UFPI. In this edition, ERCEMAPI features short courses prepared by ten professors holding doctorates and affiliated with universities in the region. The reader will find here the texts of each of the short courses taught. It is worth highlighting the strong spirit of collaboration that presides over the realization of ERCEMAPI. In addition to the collaboration of the institutions mentioned, we emphasize the personal contribution of the professors, who teach the courses without remuneration, as well as of the many students who help organize the event. Finally, we would like to formally express our thanks to all who contributed to making ERCEMAPI a successful event.
Chapter
The Internet of Things and Services (IOT/IOS) as well as Industry 4.0 assume networked products, systems, and services in the future. The share of value contributed by electronics and software will rise continuously for these kinds of products and embedded services. When products communicate with one another over the Internet, we speak of cyber-physical systems or cybertronic systems. The development of these new systems will have several consequences: interdisciplinary, regionally and organizationally distributed, and integrated product development; a rethinking of today's construction methods, processes, IT solutions, and organizational forms; and the demand for consistent process chains based on digital models in requirements definition, system architecture, product development, simulation, production planning, production, and service. Furthermore, the planning and design methods of all disciplines (mechanics, electronics, and software) must be put to the test and their suitability for a new process model of product, system, and service development examined, in order to transfer them into a common, integrated, and interdisciplinary approach to methods, processes, and IT solutions. This approach to the digitalization of product development is called Engineering 4.0. The term product development used here refers both to the actual products and to the means of production, since these, too, are products in the proper sense. This definition follows the terminology of Ehrlenspiel [29]. The notion of product development used in this book therefore also encompasses what many authors [78, 93] call product creation. The foundations are formed by the methodologies of systems engineering (SE), model-based systems engineering (MBSE), and systems thinking. The digitalization of products and of product development means a transformation process that reorders the classic boundaries of a fragmented and competing world of IT solutions: away from silo thinking and toward a consistent, integrative solution approach for engineering. A lightweight and federated engineering backbone (→ System Lifecycle Management, SysLM) will take on the role of data and process integration across the entire product lifecycle, including operation. This chapter presents the foundations, boundary conditions, and drivers of digitalization and derives from them a construction methodology adapted to the development of cybertronic products and systems.
Article
Full-text available
Software engineering techniques have been employed for many years to create software products. The selection of appropriate software development methodologies for a given project, and the tailoring of those methodologies to specific requirements, has been a challenge since the establishment of software development as a discipline. In the late 1990s, the general trend in software development techniques shifted from traditional waterfall approaches to more iterative, incremental development approaches that combine old concepts, new concepts, and metamorphosed old concepts.
Chapter
Today, companies are confronted with a dynamic, volatile, uncertain, and complex environment. To cope with this situation, companies have to be agile. Agility is a key factor for a company's success when product and process complexity increase. Agile describes a set of values and principles that is realized through the application of different practices, methods, and tools. In recent years, the application of agile methods has increased continuously. The purpose of this research is to provide a systematic, explicit, and reproducible literature overview of agile methods for complexity management. Different definitions of agile methods are described, and a new overall definition of agile methods is presented. Furthermore, the existing agile methods are identified and analyzed with respect to their content and applicability for complexity management. The literature overview was conducted by systematically collecting and analyzing existing literature. The gap in the literature is pointed out: a general overview of agile methods for complexity management does not yet exist.
Article
Model Driven Engineering (MDE) is a general-purpose engineering methodology that elevates system design, maintenance, and analysis to corresponding activities on models. Models (graphical and/or textual) of a target application are automatically transformed into source code, performance models, Promela files (for model checking), and so on for system analysis and construction. Models are instances of metamodels. One form an MDE metamodel can take is a [class diagram, constraints] pair: the class diagram defines all object diagrams that could be metamodel instances; OCL constraints eliminate semantically undesirable instances. A metamodel refactoring is an invertible, semantics-preserving co-transformation, i.e., it transforms both a metamodel and its models without losing data. This paper addresses a subproblem of metamodel refactoring: how to prove the correctness of refactorings of class diagrams without OCL constraints using the Coq Proof Assistant.
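As a hedged illustration of what an invertible, semantics-preserving co-transformation means in practice (this is plain Python, not the paper's Coq formalization; the metamodel encoding and the rename refactoring are assumptions of this sketch), the snippet below renames an attribute in a tiny metamodel, co-transforms its models, and checks that applying the inverse recovers the originals:

# A metamodel maps class names to attribute lists; a model instance records
# its class plus attribute values.
def rename_attribute(metamodel, models, cls, old, new):
    """Apply the refactoring to the metamodel and co-transform its models."""
    refactored_mm = {
        c: [new if (c == cls and a == old) else a for a in attrs]
        for c, attrs in metamodel.items()
    }
    refactored_models = [
        {(new if (m["class"] == cls and k == old) else k): v for k, v in m.items()}
        for m in models
    ]
    return refactored_mm, refactored_models

def inverse(metamodel, models, cls, old, new):
    """The inverse refactoring: renaming back loses no data."""
    return rename_attribute(metamodel, models, cls, new, old)

if __name__ == "__main__":
    mm = {"Person": ["name", "bday"]}
    ms = [{"class": "Person", "name": "Ada", "bday": "1815-12-10"}]
    mm2, ms2 = rename_attribute(mm, ms, "Person", "bday", "birthdate")
    # Round-tripping recovers the original metamodel and models (invertibility).
    assert inverse(mm2, ms2, "Person", "bday", "birthdate") == (mm, ms)

Proving such properties for all metamodels and models, rather than checking them on examples, is exactly where a proof assistant such as Coq comes in.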
Article
Developing software systems is a stakeholder-focused activity that needs to be cognizant of the priorities and constraints of the business, the users of the software, those who will need to maintain it, and the many involved in its timely, high-quality, and resource-conscious engineering. Software developers are among the key stakeholders in any software development endeavor, yet they are also those whose needs are most overlooked. I define a software developer as an individual who is knowledgeable about one or more programming languages and is capable of using the tools required to create software functionality. Software developers may also be referred to as programmers or coders. In addition, all software engineers are expected to be trained in software development, while not all software developers are trained in software engineering or participate in engineering activities. Admittedly, this distinction is fuzzy at best. Software engineering skills include the ability to understand the requirements of the software to be developed, to formulate the design and architecture of the software while understanding its tradeoffs, and to understand the many activities that need to be executed for the successful delivery of the software. The terms software development and software engineering are often used interchangeably as well.
Article
Full-text available
Context: With the increase in Agile, Lean, and DevOps software methodologies over the last years (collectively referred to as Continuous Software Development (CSD)), we have observed that documentation is often poor. Objective: This work aims to collect studies on documentation challenges, documentation practices, and tools that can support documentation in CSD. Method: A systematic mapping study was conducted to identify and analyze research on documentation in CSD, covering publications between 2001 and 2019. Results: A total of 63 studies were selected: 40 related to documentation practices and challenges, and 23 related to tools used in CSD. The challenges include: informal documentation is hard to understand, documentation is considered waste, productivity is measured by working software only, documentation is out of sync with the software, and there is a short-term focus. The practices include: non-written and informal communication, the use of development artifacts for documentation, and the use of architecture frameworks. We also made an inventory of numerous tools that can be used for documentation purposes in CSD. Overall, we recommend the use of executable documentation, modern tools and technologies to retrieve information and transform it into documentation, and the practice of minimal documentation upfront combined with detailed design for knowledge transfer afterwards. Conclusion: It is of paramount importance to increase the quantity and quality of documentation in CSD. While this remains challenging, practitioners will benefit from applying the identified practices and tools in order to mitigate the stated challenges.
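The recommendation of executable documentation can be made concrete with, for example, Python doctests: usage examples embedded in the documentation are executed as tests, so the documentation cannot silently drift out of sync with the code. A minimal sketch follows (the function name and scenario are invented for illustration, not taken from the mapped studies):

def sprint_velocity(completed_points, sprint_count):
    """Average story points completed per sprint.

    >>> sprint_velocity(60, 3)
    20.0
    >>> sprint_velocity(10, 0)
    Traceback (most recent call last):
        ...
    ValueError: sprint_count must be positive
    """
    if sprint_count <= 0:
        raise ValueError("sprint_count must be positive")
    return completed_points / sprint_count

if __name__ == "__main__":
    import doctest
    doctest.testmod()   # runs the examples embedded in the docstring as tests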
Article
Full-text available
The purpose of process evaluation is to obtain relevant qualitative and quantitative information about the current state of a process in order to support evolution and continuous improvement. Scrum is one of the most widely used agile approaches; however, some aspects can hinder its implementation, e.g., the lack of detail regarding its artefacts and the meetings it prescribes, including their timing and how the approach is applied, among others. In this sense, and in order to facilitate successful Scrum implementations, this paper presents EvaScrum, an assessment instrument that gives professionals and consultants the opportunity to assess and diagnose the degree of implementation of Scrum through questions, metrics, a spreadsheet, and a Web application. EvaScrum is based on Mr. Scrum, a reference model that provides a clear and complete set of process elements based on Scrum, and on EvaScrumTOOL, a web tool to manage the assessments. This paper presents a detailed analysis of two case studies in software development enterprises where EvaScrum was applied. The results obtained have allowed the case-study enterprises to identify improvement opportunities in the processes and projects where Scrum is applied, in a practical, useful, and suitable manner that lets these kinds of enterprises assess and diagnose their Scrum implementations with feasible time and economic resources.
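As a hedged sketch only (EvaScrum's actual questions, metrics, and spreadsheet are defined by the instrument itself; the practice names, scores, and the unweighted average below are invented for illustration), the idea of turning per-practice questionnaire answers into a degree-of-implementation figure can look like this:

# Each answer scores how well a Scrum element is implemented, in [0, 1].
answers = {
    "sprint planning held every sprint": 1.0,
    "product backlog refined regularly": 0.5,
    "sprint review with stakeholders":   0.5,
    "retrospective actions followed up": 0.0,
}

def implementation_degree(scores):
    """Unweighted average of the per-practice scores, as a percentage."""
    return 100.0 * sum(scores.values()) / len(scores)

print(f"Scrum implementation degree: {implementation_degree(answers):.0f}%")  # 50%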
Chapter
The authors give a compact overview of the well-known Scrum project framework and the situations in which this agile project approach can best be used. They share best practices and tools from project experience to help readers avoid common mistakes and get their projects running. Beyond that, the authors give a short insight into alternative agile methods and the conditions to which each is best suited.
Preprint
Full-text available
Effective leadership is one of the key drivers of business and project success, and one of the most active areas of management research. But how does leadership work in agile software development, which emphasizes self-management and self-organization and marginalizes traditional leadership roles? To find out, this study examines agile leadership from the perspective of thirteen professionals who identify as agile leaders, in different roles, at ten different software development companies of varying sizes. Data from semi-structured interviews reveals that leadership: (1) is dynamically shared among team members; (2) engenders a sense of belonging to the team; and (3) involves balancing competing organizational cultures (e.g. balancing the new agile culture with the old milestone-driven culture). In other words, agile leadership is a property of a team, not a role, and effectiveness depends on agile team members' identifying with the team, accepting responsibility, and being sensitive to cultural conflict.
Chapter
The need to maintain workflows in the context of the forced separation of society created high demand for the services provided by software systems. It suddenly turned out that the quality of many widely advertised software products is much lower than expected. One reason is that a different category of users, with different quality criteria, started to use these products. Another is that many products, and especially Internet projects, let their quality drop to an unacceptably low level. This decrease in quality mostly happened due to errors in project management, in particular an inadequate software development methodology. Given the scale and significance of a modern software project, the desire to execute it according to one of the classic models is understandable: it would provide quality but increase development time. However, the developers' desire to stay ahead of competitors forces them to reduce development time, albeit with a loss of quality. Traditionally, agile methodologies have been used in this case; they dramatically reduce the "unproductive" phases of the life cycle: the formation and analysis of requirements, planning, and testing. And the larger the Internet project, the higher the risk of quality loss and the higher the cost of that loss. Ultimately, the project might become unacceptable. What should be done to avoid this? How can the benefits of both approaches be combined without inheriting their disadvantages?
Thesis
Software is fundamental to research: seven out of ten researchers in the United Kingdom report that their work would be impossible without it [1]. Yet the scientific community faces a credibility crisis, and this is publicly known. The crisis is a multi-stakeholder problem where no single solution will suffice. It has been shown that code quality is strongly related to the quality of the scientific results. The work presented in this thesis pursues a two-fold strategy. On the one hand, in response to the problem just outlined, it explores concepts from software engineering that impact quality in an academic context, including version control, software documentation, testing, continuous integration, and software distribution. To render this exploration concrete, a field of study was chosen and two case studies were performed. Like most of the computational sciences, micromagnetics benefits greatly from fast and accurate numerical tools. Thus, on the other hand, this work contributes to the body of methodology for micromagnetic simulations. Two micromagnetic simulators were developed in preparation of this thesis. One is a finite element code called FinMag, for which the author created the initial implementation. The other is a finite difference code called Fidimag, for which the author participated in the initial implementation. The continuing development of these two packages is now a collaborative effort with many developers. The two simulators enabled published, novel, reproducible research on magnonics and the study of magnetic skyrmions. Further, the outcomes of two numerical studies that increase the performance of the micromagnetic simulators are presented. [1] S. J. Hettrick et al., UK Research Software Survey 2014, DOI: 10.5281/zenodo.1183562
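One of the practices the thesis explores, automated testing of numerical code, can be illustrated with a small regression test that pins a computed value against a known analytical result. This is a generic illustration only, not code from FinMag or Fidimag:

import math

def trapezoid(f, a, b, n=1000):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return total * h

def test_trapezoid_matches_analytical_integral():
    # The integral of sin(x) over [0, pi] is exactly 2.
    assert math.isclose(trapezoid(math.sin, 0.0, math.pi), 2.0, rel_tol=1e-5)

if __name__ == "__main__":
    test_trapezoid_matches_analytical_integral()
    print("regression test passed")

Tests of this kind, run automatically under continuous integration, are one concrete way in which code quality feeds back into the quality of the scientific results.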
Preprint
Full-text available
Due to the Covid-19 pandemic and its effects on the world of work, the burden on employees has come into focus. This also applies to the agile software development teams of many companies, owing to the extensive switch to remote work. Too high a workload can lead to various negative effects, such as increased sick leave, reduced well-being of employees, or reduced productivity. It is also known that workload in knowledge work affects the quality of the work results. This research article identifies potential factors of the workload of the agile software development team members at Otto GmbH & Co KG. Based on these factors, we present measures to reduce workload and explain our findings, which we validated in an experiment. Our results show that even small-scale actions, such as the introduction of rest phases during the working day, lead to positive effects, for example an increased ability to concentrate, and we show how these effects influence the quality of the work results.
Article
Full-text available
Context: Software development companies use Agile methods to develop their products and services efficiently and in a goal-oriented way. But this alone is no longer enough to satisfy user demands: nowadays a product or service must also offer a great user experience, because users want to have a positive experience while interacting with it. Objective: An essential prerequisite is the integration of user experience methods into Agile software development. Building on this, the development of a positive user experience must be managed. We understand management in general as a combination of a goal, a strategy, and resources; applied to UX, user experience management therefore consists of a UX goal, a UX strategy, and UX resources. Method: We conducted a systematic literature review (SLR) to analyse suitable approaches for managing user experience in the context of Agile software development. Results: We identified 49 relevant studies. After analysing them in detail, we identified different primary approaches that can be deemed suitable for UX management, as well as several UX methods that are used in combination with these primary approaches. Conclusions: However, we could not identify any approaches that directly address UX management, and there is no general definition or common understanding of UX management. To implement UX management successfully, it is important to know what UX management actually is and how to measure or determine whether it has succeeded.
Conference Paper
Full-text available
The PIO C and Fortran libraries allow for high-performance I/O on HPC systems. These libraries are developed using software engineering techniques such as branch development, pull-requests, automated testing, continuous integration, portable releases which adapt to user installation conditions, and full documentation of code for users and developers. This paper details the use of software engineering techniques on the PIO software project.
Thesis
Full-text available
Impact Assessment (IA) has evolved to become a multidisciplinary tool aimed at increasing political accountability and promoting better policy decisions. Among other IA tools, Regulatory Impact Assessment (RIA) has gained prominence with its strategic and broad scope (covering agency regulations and all kinds of significant impacts) and structured method. Growing consensus on evidence-based policies as a requirement of good governance and the Better Regulation agenda have also helped propel the diffusion of RIA. A recent trend in RIA systems has been the adoption of an ex post RIA counterpart to the traditional ex ante RIA. In other words, RIA has started to look back. This dissertation examines this recent evolutionary step of IA and argues that while adopting the ex post complement is a step in the right direction, it is ultimately a step that falls short of truly fulfilling RIA's normative goals. The foresight-hindsight divide between ex ante and ex post RIA exposes the system to the risk of missing the correct timing for policy adjustments, therefore failing to avoid unwanted welfare losses. Also, policy learning is limited. To overcome these problems, the dissertation proposes the idea of Adaptive Regulatory Impact Assessment (ARIA). The dissertation examines the benefits of ARIA, the limitations on ARIA posed by the fragmentation of IA tools, and the literature on the quality of ex ante and ex post RIA. Furthermore, it provides an overview of the techniques and tools that can make ARIA both feasible and promising.
Thesis
Full-text available
Software development projects in general have shown low success rates with respect to meeting targets for schedule, cost, and product functionality. Studies indicate that one of the reasons for the failure of projects in this segment is the absence of Risk Management or its inadequate application within the project. Nevertheless, Risk Management is still little used, including in software development projects. Such projects increasingly adopt agile methodologies, especially the Scrum framework, the most popular among them. However, Scrum, like agile methodologies in general, does not define specific activities for carrying out Risk Management. Against this background, this dissertation presents the results of applied research using a qualitative approach, with the aim of analyzing how Risk Management is performed in Scrum projects. The research method adopted was a case study, conducted with employees of the Inatel Competence Center who work on software development projects. The research instrument for data collection was developed based on scientific articles on Risk Management in software projects, complemented by structured interviews. As a result, this research presents Risk Management practices and their respective statistical analyses. The practices that showed the greatest agreement with the literature were: applying Risk Management continuously in a feedback loop, and reusing risk knowledge within the project team.