Conference Paper

An empirical study of profiling strategies for released software and their impact on testing activities

DOI: 10.1145/1007512.1007522 Conference: Proceedings of the ACM/SIGSOFT International Symposium on Software Testing and Analysis, ISSTA 2004, Boston, Massachusetts, USA, July 11-14, 2004
Source: DBLP

ABSTRACT

An understanding of how software is employed in the field can yield many opportunities for quality improvements. Profiling released software can provide such an understanding. However, profiling released software is difficult due to the potentially large number of deployed sites that must be profiled, the extreme transparency expectations, and the remote data collection and deployment management process. Researchers have recently proposed various approaches to tap into the opportunities and overcome those challenges. Initial studies have illustrated the application of these approaches and have shown their feasibility. Still, the promising proposed approaches, and the tradeoffs between overhead, accuracy, and potential benefits for the testing activity, have been barely quantified. This paper aims to overcome those limitations. Our analysis of 1,200 user sessions on a 155 KLOC system substantiates the ability of field data to support test suite improvements, quantifies different approaches previously introduced in isolation, and assesses the efficiency of profiling techniques for released software and the effectiveness of their associated testing efforts.
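One way field data can drive test suite improvement, as the abstract describes, is by contrasting the program entities covered by the in-house test suite with those exercised in deployed user sessions. The sketch below is purely illustrative (the entity IDs, data, and function name are assumptions, not taken from the paper): it ranks entities that users exercise but the test suite misses, which are natural candidates for new test cases.

```python
# Hypothetical sketch: using profiled field data to find test-suite
# improvement candidates. Coverage is modeled as sets of covered entity
# IDs (e.g., method or block IDs); all names and data are illustrative.

def improvement_candidates(in_house_coverage, field_sessions):
    """Return entities exercised in the field but missed by the in-house
    test suite, ranked by how many user sessions exercise them."""
    field_counts = {}
    for session in field_sessions:
        for entity in session:
            field_counts[entity] = field_counts.get(entity, 0) + 1
    missed = {e: n for e, n in field_counts.items()
              if e not in in_house_coverage}
    # Entities hit by many sessions come first: they represent common
    # field behavior the current test suite does not exercise.
    return sorted(missed, key=lambda e: -missed[e])

in_house = {"m1", "m2", "m3"}
sessions = [{"m1", "m4"}, {"m2", "m4", "m5"}, {"m4"}]
print(improvement_candidates(in_house, sessions))  # ['m4', 'm5']
```

A real deployment would aggregate session coverage remotely, but the set-difference-and-rank step shown here is the core of turning field profiles into test targets.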

Full-text preview available from: psu.edu
  • Source
    ABSTRACT: An understanding of how software is employed in the field can yield many opportunities for quality improvements. Profiling released software can provide such an understanding. However, profiling released software is difficult due to the potentially large number of deployed sites that must be profiled, the transparency requirements at a user's site, and the remote data collection and deployment management process. Researchers have recently proposed various approaches to tap into the opportunities offered by profiling deployed systems and overcome those challenges. Initial studies have illustrated the application of these approaches and have shown their feasibility. Still, the proposed approaches, and the tradeoffs between overhead, accuracy, and potential benefits for the testing activity have been barely quantified. This paper aims to overcome those limitations. Our analysis of 1,200 user sessions on a 155 KLOC deployed system substantiates the ability of field data to support test suite improvements, assesses the efficiency of profiling techniques for released software, and the effectiveness of testing efforts that leverage profiled field data.
    Preview · Article · May 2005 · IEEE Transactions on Software Engineering
  • Source
    ABSTRACT: Testers use coverage data for test suite quality assessment, stopping criteria definition, and effort allocation. However, as the complexity of products and testing processes increases, the cost of coverage data collection may grow significantly, jeopardizing its potential application. We present two techniques to mitigate this problem based on the concept of "disposable coverage instrumentation": coverage instrumentation that is removed after its usage. The idea is to reduce coverage collection overhead by removing instrumentation probes after they have been executed. We have extended a Java virtual machine to support these techniques, and show their potential through empirical studies with the SPECjvm98 and SPECjbb2000 benchmarks. The results indicate that the techniques can reduce coverage collection overhead between 18% and 97% over existing techniques.
    Preview · Conference Paper · Dec 2004
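The "disposable coverage instrumentation" idea in the abstract above can be sketched in a few lines. The cited work extends a Java virtual machine; the Python model below is only an assumed illustration of the principle: each probe records its coverage target once and then removes itself, so subsequent executions of the same site pay no probe cost.

```python
# Hypothetical sketch of "disposable coverage instrumentation": probes
# record coverage once, then dispose of themselves. The class and method
# names are illustrative assumptions, not the cited paper's API.

class CoverageProbes:
    def __init__(self):
        self.covered = set()
        self.probes = {}   # site id -> probe function
        self.hits = 0      # total probe executions, to show the savings

    def instrument(self, site_id):
        def probe():
            self.hits += 1
            self.covered.add(site_id)
            del self.probes[site_id]  # dispose of probe after first hit
        self.probes[site_id] = probe

    def execute(self, site_id):
        # Simulates reaching an instrumented site during execution.
        probe = self.probes.get(site_id)
        if probe is not None:
            probe()

probes = CoverageProbes()
for site in ("b1", "b2"):
    probes.instrument(site)
for site in ("b1", "b1", "b2", "b1"):  # b1 runs three times, b2 once
    probes.execute(site)
print(sorted(probes.covered), probes.hits)  # ['b1', 'b2'] 2
```

Although four sites are executed, only two probe invocations occur, mirroring how removing probes after first execution cuts collection overhead while still yielding complete coverage data.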