DOI: 10.1145/2676723.2677295

A Practical Guide to Developing and Validating Computer Science Knowledge Assessments with Application to Middle School

Published: 24 February 2015

Abstract

Knowledge assessment instruments, or tests, are commonly created by faculty in classroom settings to measure student knowledge and skill. Another crucial role for assessment instruments is in gauging student learning in response to a computer science education research project, or intervention. In an increasingly interdisciplinary landscape, it is essential to validate knowledge assessment instruments, yet developing and validating these tests for computer science poses substantial challenges. This paper presents a seven-step approach to designing, iteratively refining, and validating knowledge assessment instruments intended not to assign grades but to measure the efficacy or promise of novel interventions. We also detail how this seven-step process is being instantiated within a three-year project to implement a game-based learning environment for middle school computer science. This paper serves as a practical guide for adapting widely accepted psychometric practices to the development and validation of computer science knowledge assessments to support research.
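
One of the widely accepted psychometric practices the abstract alludes to is estimating a test's internal-consistency reliability when piloting items. As an illustration only (this snippet and its data are not drawn from the paper), the following minimal Python sketch computes Cronbach's alpha, the standard statistic for that purpose:

# Illustrative sketch: Cronbach's alpha, a standard internal-consistency
# reliability estimate for piloted knowledge assessments.
# Rows are students, columns are dichotomously scored items (1 = correct).

def cronbach_alpha(scores):
    """Cronbach's alpha for a students-by-items matrix of item scores."""
    n_items = len(scores[0])

    def variance(values):  # sample variance (n - 1 denominator)
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_variances = [variance([row[i] for row in scores])
                      for i in range(n_items)]
    total_variance = variance([sum(row) for row in scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)

# Hypothetical pilot data: 5 students answering 4 items.
pilot = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 0, 1, 1],
]
print(round(cronbach_alpha(pilot), 3))  # 0.754

By the usual rule of thumb, values around 0.7 or higher suggest acceptable internal consistency for a research instrument, while low values flag items for revision in the kind of iterative refinement the paper describes.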




    Published In

    SIGCSE '15: Proceedings of the 46th ACM Technical Symposium on Computer Science Education
    February 2015, 766 pages
    ISBN: 9781450329668
    DOI: 10.1145/2676723


    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. assessment
    2. computer science education
    3. middle school

    Qualifiers

    • Research-article

    Funding Sources

    • NSF Division of Computer and Network Systems

    Conference

    SIGCSE '15

    Acceptance Rates

    SIGCSE '15 Paper Acceptance Rate: 105 of 289 submissions, 36%
    Overall Acceptance Rate: 1,787 of 5,146 submissions, 35%



