Research article
DOI: 10.1145/1734263.1734411

Analyzing test items: using item response theory to validate assessments

Published: 10 March 2010

Abstract

As professional educators we produce a large number of assessments for our students to complete. These assessments, or exams, are often evaluated informally, based on student feedback and simple measures such as the average score on a question. This paper highlights a more rigorous approach to item evaluation and presents an evaluation of several items from an assessment as examples of the kind of information that Item Response Theory can provide.
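Item Response Theory models the probability that a student answers an item correctly as a function of the student's latent ability and the item's parameters. As a minimal sketch of the idea (the parameter values below are illustrative, not taken from the paper), the widely used two-parameter logistic (2PL) model can be written as:

```python
import math

def item_response_2pl(theta: float, a: float, b: float) -> float:
    """Probability that a student with latent ability `theta` answers an
    item correctly under the 2PL IRT model, where `a` is the item's
    discrimination and `b` its difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Illustrative (hypothetical) parameters: a reasonably discriminating
# item of average difficulty.
a, b = 1.5, 0.0
for theta in (-2.0, 0.0, 2.0):
    print(f"theta={theta:+.1f}  P(correct)={item_response_2pl(theta, a, b):.3f}")
```

A student whose ability equals the item difficulty (`theta == b`) has a 0.5 probability of success; the discrimination `a` controls how sharply that probability rises with ability, which is what makes fitted item curves useful for spotting weak or miskeyed questions.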



Published In

SIGCSE '10: Proceedings of the 41st ACM technical symposium on Computer science education
March 2010
618 pages
ISBN:9781450300063
DOI:10.1145/1734263

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. assessment
  2. item response theory


Conference

SIGCSE10

Acceptance Rates

Overall acceptance rate: 1,787 of 5,146 submissions (35%)


Article Metrics

  • Downloads (last 12 months): 4
  • Downloads (last 6 weeks): 1
Reflects downloads up to 17 Feb 2025

Cited By

  • (2022) "Identifying Difficult Exercises in an eTextbook Using Item Response Theory and Logged Data Analysis", 2022 10th International Japan-Africa Conference on Electronics, Communications, and Computations (JAC-ECC), pp. 258-263. DOI: 10.1109/JAC-ECC56395.2022.10043955
  • (2020) "Test and Item Response Theories and School Environment as Assessment Practice Factors among Science and Mathematics Teachers in Secondary Schools in Eastern Uganda", East African Journal of Education and Social Sciences, 1(3), pp. 77-86. DOI: 10.46606/eajess2020v01i03.0045
  • (2020) "Development and Validation of the Middle Grades Computer Science Concept Inventory (MG-CSCI) Assessment", EURASIA Journal of Mathematics, Science and Technology Education, 16(5). DOI: 10.29333/ejmste/116600
  • (2019) "cpm.4.CSE/IRT", Education and Information Technologies, 24(1), pp. 843-884. DOI: 10.1007/s10639-018-9794-3
  • (2018) "Using IATA software to analyze, evaluate, and improve the quality of objective multiple-choice questions in the chapter on power, exponential, and logarithmic functions" (in Vietnamese), Can Tho University Journal of Science, 54(9), p. 81. DOI: 10.22144/ctu.jvn.2018.164
  • (2018) "Predicting Assessment Item Difficulty Levels Using a Gaussian Mixture Model", 2018 International Conference on Data Science and Engineering (ICDSE), pp. 1-6. DOI: 10.1109/ICDSE.2018.8527800
  • (2018) "Automatic Assessment Item Bank Calibration for Learning Gap Identification", 2018 International Conference on Advances in Computing, Communications and Informatics (ICACCI), pp. 1429-1435. DOI: 10.1109/ICACCI.2018.8554481
  • (2015) "Design and First Results of a Psychometric Test for Measuring Basic Programming Abilities", Proceedings of the Workshop in Primary and Secondary Computing Education, pp. 2-10. DOI: 10.1145/2818314.2818320
  • (2015) "Using Commutative Assessments to Compare Conceptual Understanding in Blocks-based and Text-based Programs", Proceedings of the Eleventh Annual International Conference on International Computing Education Research, pp. 101-110. DOI: 10.1145/2787622.2787721
  • (2015) "A Practical Guide to Developing and Validating Computer Science Knowledge Assessments with Application to Middle School", Proceedings of the 46th ACM Technical Symposium on Computer Science Education, pp. 622-627. DOI: 10.1145/2676723.2677295
