DOI: 10.1145/2676723.2677300

Research Article

Checked Coverage and Object Branch Coverage: New Alternatives for Assessing Student-Written Tests

Published: 24 February 2015

ABSTRACT

Many educators currently use code coverage metrics to assess student-written software tests. While test adequacy criteria such as statement or branch coverage can be used to measure the thoroughness of a test suite, they have limitations: coverage metrics assess what percentage of the code has been exercised, but not whether the test suite adequately checks that the expected behavior is achieved. This paper evaluates checked coverage, an alternative measure of test thoroughness aimed at overcoming this limitation, along with object branch coverage, a structural code coverage metric that has received little discussion in educational assessment. Checked coverage works backwards from the behavioral assertions in test cases, measuring the dynamic slice of the executed code that actually influences the outcome of each assertion. Object branch coverage (OBC) is a stronger coverage criterion, similar to weak variants of modified condition/decision coverage (MC/DC). We experimentally compare checked coverage and OBC against statement coverage, branch coverage, mutation analysis, and all-pairs testing to evaluate which best predicts how likely a test suite is to detect naturally occurring defects. While checked coverage outperformed the other coverage measures in our experiment, followed closely by OBC, both were only weakly correlated with a test suite's ability to detect the naturally occurring defects that students produced in the final versions of their programs. Still, OBC appears to be an improved yet practical alternative to existing statement and branch coverage measures, achieving nearly the same benefits as checked coverage.
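
To make the distinction concrete, here is a minimal JUnit 4 sketch (not taken from the paper; the Counter class and test names are hypothetical). Both tests execute exactly the same statements, so statement and branch coverage rate them identically, but only the first contributes to checked coverage, which credits only statements lying on the dynamic backward slice of some assertion.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Hypothetical class under test, used only for illustration.
    class Counter {
        private int value;

        public void increment() { value++; }

        public int getValue() { return value; }
    }

    public class CounterTest {

        @Test
        public void incrementChecked() {
            Counter c = new Counter();
            c.increment();
            // increment() and getValue() lie on the dynamic backward slice
            // of this assertion, so checked coverage counts them as checked.
            assertEquals(1, c.getValue());
        }

        @Test
        public void incrementUnchecked() {
            Counter c = new Counter();
            c.increment();
            c.getValue();
            // The same statements run, but with no assertion they influence
            // no checked outcome, so checked coverage credits none of them.
        }
    }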

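OBC's additional strength comes from measuring branches in the compiled object code rather than in the source. A short-circuit condition like the one below compiles to more than one conditional jump, so OBC can distinguish test suites that plain source-level branch coverage cannot. The canBorrow example is hypothetical and exact counts depend on the tool; this is a sketch of the criterion, not of any particular tool's report.

    public class BorrowPolicy {

        // Hypothetical predicate used only for illustration. The source
        // contains a single decision, but the short-circuit && compiles
        // to two conditional jumps in the bytecode.
        static boolean canBorrow(boolean hasCard, boolean underLimit) {
            if (hasCard && underLimit) {
                return true;
            }
            return false;
        }

        public static void main(String[] args) {
            // These two calls take the source-level decision both ways,
            // so ordinary branch coverage reports 100%.
            canBorrow(true, true);   // decision evaluates to true
            canBorrow(false, true);  // decision evaluates to false
            // The jump testing underLimit has only ever been taken on its
            // true side (it is skipped entirely when hasCard is false), so
            // OBC stays below 100% until a call such as
            // canBorrow(true, false) exercises its other side.
        }
    }

This is why the abstract likens OBC to weak variants of MC/DC: covering every object-code branch forces each condition in a short-circuit expression to be evaluated both ways wherever it is reachable.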
