Research article · DOI: 10.1145/3159450.3159579 · SIGCSE Conference Proceedings

The Effect of a Web-based Coding Tool with Automatic Feedback on Students' Performance and Perceptions

Published: 21 February 2018

ABSTRACT

In this paper we do three things. First, we describe a web-based coding tool that is open-source, publicly available, and provides formative feedback and assessment. Second, we compare several metrics of student performance in courses that use the tool with courses that do not when learning to program in Haskell. We find that dropout rates are significantly lower in the courses that use the tool, at two different universities. Finally, we apply the Technology Acceptance Model to analyse students' perceptions.
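The tool itself is not reproduced on this page, but the kind of automatic formative feedback the abstract describes can be sketched in plain Haskell: run a student submission and a reference solution on the same inputs and report any mismatches. This is a minimal, hypothetical illustration; the function names and the exercise are not from the paper.

```haskell
-- Reference solution for a typical introductory Haskell exercise:
-- sum of the squares of a list of integers.
refSquareSum :: [Int] -> Int
refSquareSum = sum . map (^ 2)

-- A buggy student submission: sums the elements but forgets to square them.
studentSquareSum :: [Int] -> Int
studentSquareSum = sum

-- Compare the submission against the reference on each test input,
-- producing one feedback message per failing case.
feedback :: ([Int] -> Int) -> ([Int] -> Int) -> [[Int]] -> [String]
feedback ref student inputs =
  [ "failed on " ++ show i ++ ": expected " ++ show (ref i)
      ++ ", got " ++ show (student i)
  | i <- inputs
  , ref i /= student i
  ]

main :: IO ()
main = mapM_ putStrLn (feedback refSquareSum studentSquareSum [[], [1], [1, 2]])
```

A production tool would typically generate the test inputs randomly (e.g. with a property-based testing library such as QuickCheck, which the Haskell community commonly uses for this purpose) rather than hard-coding them.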


Published in

SIGCSE '18: Proceedings of the 49th ACM Technical Symposium on Computer Science Education
February 2018, 1174 pages
ISBN: 9781450351034
DOI: 10.1145/3159450

Copyright © 2018 ACM. Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor, or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

SIGCSE '18 paper acceptance rate: 161 of 459 submissions (35%). Overall acceptance rate: 1,595 of 4,542 submissions (35%).
