DOI: 10.1145/2931037.2931064

Verdict machinery: on the need to automatically make sense of test results

Published: 18 July 2016

Abstract

Along with technological developments and increasing competition, there is a strong incentive for companies to produce and market high-quality products before their competitors do. In order to conquer a bigger portion of the market share, companies have to ensure the quality of the product within a shorter time frame. To accomplish this, companies try to automate their test processes as much as possible, and it is critical to investigate and understand the problems that occur during the different stages of the test automation process. In this paper we report on a case study on the automatic analysis of non-functional test results. We discuss challenges in the face of continuous integration and deployment, and provide improvement suggestions based on interviews at a large company in Sweden. The key contributions of this work are filling the knowledge gap in research on the automation of performance regression test analysis, and providing warning signs and a road map for industry.
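
As a purely illustrative sketch, not taken from the paper, an automated verdict for performance regression analysis could compare measurements from the current build against a baseline and emit a pass/fail result without manual inspection. The function name, threshold, and sample data below are assumptions made for the example (Python):

import statistics

def performance_verdict(baseline_ms, current_ms, tolerance=0.05):
    # Toy verdict: fail when the median latency of the current run exceeds
    # the baseline median by more than the given tolerance (5% by default).
    baseline_median = statistics.median(baseline_ms)
    current_median = statistics.median(current_ms)
    return "FAIL" if current_median > baseline_median * (1 + tolerance) else "PASS"

# Hypothetical response-time samples in milliseconds.
baseline = [101, 99, 103, 100, 98]
current = [108, 110, 107, 109, 111]
print(performance_verdict(baseline, current))  # prints "FAIL"

In practice such a verdict would have to account for measurement noise, for example by using statistical tests over repeated runs rather than a single threshold, which is part of what makes the automatic analysis of non-functional test results difficult.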

Published In

ISSTA 2016: Proceedings of the 25th International Symposium on Software Testing and Analysis
July 2016
452 pages
ISBN: 978-1-4503-4390-9
DOI: 10.1145/2931037
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Automation
  2. Non-Functional Testing Oracle
  3. Performance regression test analysis
  4. Verdict System

Qualifiers

  • Research-article

Conference

ISSTA '16

Acceptance Rates

Overall Acceptance Rate: 58 of 213 submissions, 27%

Cited By

  • (2025) Performance regression testing initiatives. Information and Software Technology, 179(C). DOI: 10.1016/j.infsof.2024.107641. Online publication date: 1 March 2025.
  • (2024) AI-driven Java Performance Testing: Balancing Result Quality with Testing Time. Proceedings of the 39th IEEE/ACM International Conference on Automated Software Engineering, pages 443–454. DOI: 10.1145/3691620.3695017. Online publication date: 27 October 2024.
  • (2023) Ebola optimization search algorithm for the enhancement of devops and cycle time reduction. International Journal of Information Technology, 15(3):1309–1317. DOI: 10.1007/s41870-023-01217-7. Online publication date: 9 March 2023.
  • (2021) Constructive Master's Thesis Work in Industry: Guidelines for Applying Design Science Research. 2021 IEEE/ACM 43rd International Conference on Software Engineering: Software Engineering Education and Training (ICSE-SEET), pages 110–121. DOI: 10.1109/ICSE-SEET52601.2021.00021. Online publication date: May 2021.
  • (2018) An evaluation of open-source software microbenchmark suites for continuous performance assessment. Proceedings of the 15th International Conference on Mining Software Repositories, pages 119–130. DOI: 10.1145/3196398.3196407. Online publication date: 28 May 2018.
  • (2018) Automation of regression test in microservice architecture. 2018 4th International Conference on Web Research (ICWR), pages 133–137. DOI: 10.1109/ICWR.2018.8387249. Online publication date: April 2018.
