Conference Paper

AutoFlow: An Automatic Debugging Tool for AspectJ Software

School of Software, Shanghai Jiao Tong University, Shanghai
DOI: 10.1109/ICSM.2008.4658109 Conference: 2008 IEEE International Conference on Software Maintenance (ICSM 2008)
Source: IEEE Xplore


Aspect-oriented programming (AOP) is gaining popularity with the wider adoption of languages such as AspectJ. During AspectJ software evolution, when regression tests fail, it can be tedious for programmers to find the failure-inducing changes by manually inspecting all code edits. To reduce the expensive effort spent on debugging, we developed AutoFlow, an automatic debugging tool for AspectJ software. AutoFlow combines the delta debugging algorithm with change impact analysis to narrow down the search for faulty changes. It first uses change impact analysis to identify the subset of changes responsible for a failed test, then ranks these changes according to our proposed heuristic (which indicates how likely each change is to have contributed to the failure), and finally employs an improved delta debugging algorithm to determine a minimal set of faulty changes. The main feature of AutoFlow is that it automatically discards a large portion of irrelevant changes in an early phase, and can therefore locate faulty changes effectively.
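As a rough illustration of the delta debugging step, the following is a minimal Python sketch of a ddmin-style minimization over a list of changes. The list representation of a change set and the `fails` predicate are assumptions for illustration only, not AutoFlow's actual interface or its improved algorithm.

```python
def ddmin(changes, fails):
    """Minimize a failure-inducing list of changes (ddmin-style sketch).

    `fails(subset)` must return True when applying exactly `subset`
    of the changes reproduces the test failure.
    """
    n = 2  # current number of chunks to split the change set into
    while len(changes) >= 2:
        size = max(1, len(changes) // n)
        chunks = [changes[i:i + size] for i in range(0, len(changes), size)]
        reduced = False
        for chunk in chunks:
            complement = [c for c in changes if c not in chunk]
            if fails(chunk):           # a single chunk reproduces the failure
                changes, n, reduced = chunk, 2, True
                break
            if fails(complement):      # dropping the chunk keeps the failure
                changes, n, reduced = complement, max(n - 1, 2), True
                break
        if not reduced:
            if n >= len(changes):      # already at single-change granularity
                break
            n = min(len(changes), 2 * n)
    return changes


# Example: out of 8 edits, only the interaction of edits 3 and 7 breaks the test.
minimal = ddmin(list(range(1, 9)), lambda s: 3 in s and 7 in s)
```

Each iteration either shrinks the failing change set to a chunk or its complement, or refines the split; in the worst case the number of test runs is quadratic in the number of changes, which is why AutoFlow's pre-filtering and ranking of changes matters.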



Available from: Jianjun Zhao, Sep 21, 2015
  • Source
    ABSTRACT: When regression tests fail unexpectedly after a long editing session, it may be tedious for programmers to find the failure-inducing changes by manually inspecting all code edits. To eliminate the expensive effort spent on debugging, we present a hybrid approach, combining both static and dynamic analysis techniques, to automatically identify the faulty changes. Our approach first uses static change impact analysis to isolate a subset of changes responsible for a failed test, then utilizes dynamic test execution information to rank these changes according to our proposed heuristic (indicating the likelihood that they contributed to the failure), and finally employs an improved Three-Phase delta debugging algorithm, working from the coarse method level down to the fine statement level, to find a minimal set of faulty statements. We implemented the proposed approach for both Java and AspectJ programs in our AutoFlow prototype. In our evaluation with two third-party applications, we demonstrate that this hybrid approach can be very effective: at least for the subject programs we investigated, it takes significantly (almost 4X) fewer tests than the original delta debugging algorithm to locate the faulty code.
    Full-text · Conference Paper · Jan 2008
  • Source
    ABSTRACT: Change impact analysis is a useful technique for software evolution. It determines the effects of a source editing session and provides valuable feedback to programmers for making correct decisions. Recently, many techniques have been proposed to support change impact analysis of procedural or object-oriented software, but little effort has been made for aspect-oriented software. In this paper we propose a new change impact analysis technique for AspectJ programs. At the core of our approach is the atomic change representation, which captures the semantic differences between two versions of an AspectJ program. We also present an impact analysis model, based on AspectJ call graph construction, to determine the affected program fragments, affected tests, and their responsible changes. The proposed techniques have been implemented in Celadon, a change impact analysis framework for AspectJ programs. We performed an empirical evaluation on 24 versions of eight AspectJ benchmarks. The results show that our technique can effectively perform change impact analysis and provide valuable information during AspectJ software evolution.
    Full-text · Conference Paper · Sep 2008
  • Source
    ABSTRACT: Software change impact analysis (CIA) is a technique for identifying the effects of a change, or estimating what needs to be modified to accomplish a change. Since the 1980s, there have been many investigations of CIA, especially code-based CIA techniques, yet there have been very few surveys on this topic. This article tries to fill this gap: 30 papers that provide empirical evaluations of 23 code-based CIA techniques are identified, and their data is synthesized against four research questions. The study presents a comparative framework of seven properties that characterize CIA techniques, and identifies key applications of CIA in software maintenance. In addition, it points out the need for further research in the following areas: evaluating existing CIA techniques and proposing new ones under the proposed framework, developing more mature tools to support CIA, comparing current CIA techniques empirically with unified metrics and common benchmarks, and applying CIA more extensively and effectively in the software maintenance phase. Copyright © 2012 John Wiley & Sons, Ltd.
    Full-text · Article · Dec 2013 · Software Testing Verification and Reliability
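The call-graph-based impact analysis described in the Celadon abstract can be sketched as a simple reachability check: a test is affected if some changed method is transitively reachable from it in the call graph. The dictionary encoding and method names below are illustrative assumptions, not Celadon's actual data structures.

```python
def affected_tests(call_graph, tests, changed):
    """Return the tests whose transitive callees include a changed method.

    call_graph: dict mapping a method name to the set of methods it calls
    tests:      list of test entry-point names
    changed:    set of method names edited between the two program versions
    """
    affected = []
    for test in tests:
        stack, seen = [test], set()
        while stack:                       # depth-first traversal from the test
            method = stack.pop()
            if method in seen:
                continue
            seen.add(method)
            stack.extend(call_graph.get(method, ()))
        if seen & changed:                 # some changed method is reachable
            affected.append(test)
    return affected


# Hypothetical example graph: testFoo transitively calls the edited Util.log.
graph = {
    "testFoo": {"Foo.bar"},
    "testBaz": {"Baz.qux"},
    "Foo.bar": {"Util.log"},
}
result = affected_tests(graph, ["testFoo", "testBaz"], {"Util.log"})
```

Only the affected tests need to be rerun, and only the changes reachable from a failing test need to enter the delta debugging phase, which is how impact analysis prunes the search space up front.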