Research article
DOI: 10.1145/2851613.2851783

Software testing in a scientific research group

Published: 4 April 2016

ABSTRACT

Scientific software is harder to test than many other software products, yet scientists are rarely trained in software engineering techniques. Given how often software is used to produce scientific results, how can we be confident that the predictions drawn from those results are correct? Software engineering techniques should be useful to computational scientists; the problem is that they find it difficult to apply domain-independent techniques to the specific problems they face in their work. Nevertheless, we have found that scientists use their own intuition to reinvent techniques surprisingly similar to those in software engineering. This seems like a good place to start our training.
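The abstract notes that domain-independent testing techniques can be hard to map onto scientific code. A minimal sketch (not from the paper; the function and test case are illustrative assumptions) of one such mapping is tolerance-based oracle testing: check a numerical routine against a case with a known analytic answer, comparing within a floating-point tolerance rather than for exact equality.

```python
import math

def trapezoid(f, a, b, n=1000):
    """Numerically integrate f over [a, b] using the trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Oracle: the integral of sin(x) over [0, pi] is exactly 2.
# A tolerance absorbs the method's discretization error.
result = trapezoid(math.sin, 0.0, math.pi)
assert math.isclose(result, 2.0, rel_tol=1e-5)
```

The same pattern generalizes: pick inputs where the answer is known from theory (analytic solutions, conservation laws, limiting cases) and assert agreement within an error bound appropriate to the method.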


Published in

SAC '16: Proceedings of the 31st Annual ACM Symposium on Applied Computing
April 2016, 2360 pages
ISBN: 9781450337397
DOI: 10.1145/2851613

Copyright © 2016 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

SAC '16 paper acceptance rate: 252 of 1,047 submissions (24%). Overall acceptance rate: 1,650 of 6,669 submissions (25%).