research-article
DOI: 10.1145/3340495.3342752

Integrating runtime data with development data to monitor external quality: challenges from practice

Published: 26 August 2019

ABSTRACT

The use of software analytics in software development companies has grown in recent years. Still, such companies have little support for obtaining integrated, insightful, and actionable information at the right time. This research explores the integration of runtime and development data to analyze, based on real project data, to what extent external quality is related to internal quality. Over the course of more than three months, we collected and analyzed data of a software product following the CRISP-DM process. We studied the integration possibilities between runtime and development data and implemented two integrations. The number of bugs found in code has a weak positive correlation with code quality measures and a moderate negative correlation with the number of rule violations found. Exploring other types of correlations requires more data cleaning and higher-quality data. During our study, we encountered several challenges in exploiting data gathered both at runtime and during development. Lessons learned from integrating external and internal data in software projects may be useful for practitioners and researchers alike.
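The core analysis the abstract describes, joining runtime data (e.g., bug counts) with development data (e.g., static code-quality metrics) on a shared key and correlating them, can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline; the module names, metric values, and the choice of Pearson correlation are all hypothetical.

```python
from math import sqrt

# Hypothetical per-module data: bug counts observed at runtime and a static
# code-quality metric (e.g., cyclomatic complexity) keyed by module name.
bugs = {"core": 14, "ui": 9, "db": 5, "api": 11, "auth": 3}
complexity = {"core": 38, "ui": 21, "db": 12, "api": 30, "auth": 9}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Integrate the two sources on their shared key (the module name), then
# correlate the aligned value lists.
modules = sorted(bugs.keys() & complexity.keys())
r = pearson([bugs[m] for m in modules], [complexity[m] for m in modules])
print(round(r, 3))
```

In practice the join key is the hard part: as the paper's challenges suggest, runtime and development tools rarely share clean identifiers, so substantial data cleaning precedes any correlation.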


Published in

SQUADE 2019: Proceedings of the 2nd ACM SIGSOFT International Workshop on Software Qualities and Their Dependencies
August 2019, 38 pages
ISBN: 9781450368575
DOI: 10.1145/3340495
Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

