ABSTRACT
Software engineering methodologies are subject to complex cost-benefit tradeoffs. Economic models can help practitioners and researchers assess methodologies relative to these tradeoffs. Effective economic models, however, can be established only through an iterative process of refinement involving analytical and empirical methods. Sensitivity analysis provides one such method. By identifying the factors that matter most to a model, sensitivity analysis can help simplify that model; it can also identify factors that must be measured with care, leading to guidelines for better test strategy definition and application. In prior work we presented the first comprehensive economic model for the regression testing process, one that captures both cost and benefit factors relevant to that process while supporting evaluation of the process across entire system lifetimes. In this work we use sensitivity analysis to examine our model analytically and identify the factors that are most important to it. Based on the results of that analysis, we propose two new models of increasing simplicity. We assess these models empirically on data obtained by applying regression testing techniques to several non-trivial software systems. Our results show that one of the simplified models assesses the relationships between techniques in the same way as the full model.
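To make the idea concrete, the following is a minimal sketch of one-at-a-time (OAT) sensitivity analysis applied to a toy cost-benefit model of regression testing. The model, its factor names, and all numeric values are hypothetical illustrations, not the model or data from this paper; real sensitivity analyses typically use global methods over realistic factor distributions.

```python
# Illustrative one-at-a-time (OAT) sensitivity analysis on a toy
# cost-benefit model of regression testing. The model and its factor
# values are hypothetical, not taken from the paper.

def benefit(factors):
    """Toy net-benefit model: value of faults detected minus
    test-execution and analysis costs."""
    return (factors["faults_found"] * factors["cost_per_fault"]
            - factors["num_tests"] * factors["cost_per_test"]
            - factors["analysis_cost"])

baseline = {
    "faults_found": 20.0,    # faults detected per release (hypothetical)
    "cost_per_fault": 500.0, # cost avoided per fault found early
    "num_tests": 1000.0,     # test cases executed
    "cost_per_test": 2.0,    # execution cost per test case
    "analysis_cost": 300.0,  # fixed analysis overhead
}

def oat_sensitivity(model, baseline, delta=0.10):
    """Perturb each factor by +/- delta (10% by default) and report
    the resulting swing in model output, yielding a rough ranking of
    factor importance."""
    swings = {}
    for name, value in baseline.items():
        lo = dict(baseline, **{name: value * (1 - delta)})
        hi = dict(baseline, **{name: value * (1 + delta)})
        swings[name] = abs(model(hi) - model(lo))
    return sorted(swings.items(), key=lambda kv: -kv[1])

for name, swing in oat_sensitivity(benefit, baseline):
    print(f"{name:15s} output swing: {swing:8.1f}")
```

Under this toy model, the fault-related factors dominate the output swing, suggesting they must be measured carefully, while low-swing factors are candidates for simplification; that is the kind of ranking the paper's analysis produces for its full economic model.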