
PeerWise

Published: 13 November 2008

Abstract

PeerWise is a web-based system that allows multiple-choice question banks to be built solely from student input. The system provides a number of intrinsic reward structures that encourage students to contribute high-quality questions in the complete absence of instructor moderation. Several opportunities for learning arise, spanning the range from simple drill-and-practice exercises to deep, reflective study. Affective skills are also developed, as students are challenged to give and receive critical feedback and provide quality judgements.
The system is freely available and has been used in a range of disciplines at two universities.



Index Terms

  1. PeerWise

    Reviews

    Barrett Hazeltine

    Denny et al. describe a Web-based system that creates a bank of multiple-choice questions written by students. Students learn by composing questions and by answering others' questions. When creating a question, the student must specify why one alternative is correct and the others are not. Other students can judge whether the proposed answer is correct and post their own judgments. They can also judge and post the level of difficulty and quality of the question.

    The system has been used in a variety of courses at two universities and is accepted by students, especially as a tool for review before examinations. Significant correlation was found between extensive use of the system and performance on final examinations, presumably because the students who are writing and answering questions are engaged with the course material. A study of the questions written and answered shows that the questions are of high quality and that students made good judgments about that quality.

    The software is intended to be simple to use, and requires little instructor involvement. It may be useful software, but unfortunately the paper does not provide information on how to obtain it.

    Online Computing Reviews Service
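    The workflow the review describes — an author must justify the intended answer, and peers then answer the question and rate its difficulty and quality — can be sketched as a minimal data model. This is a hypothetical illustration only: all class names, field names, and rating scales below are assumptions, not PeerWise's actual implementation.

    ```python
    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class Question:
        """One student-contributed multiple-choice question.

        Hypothetical sketch of the workflow described in the review;
        PeerWise's real data model is not documented in this record.
        """
        stem: str
        alternatives: list[str]
        correct_index: int
        explanation: str  # the author must justify the intended answer
        answers: list[int] = field(default_factory=list)         # choices made by peers
        quality_ratings: list[int] = field(default_factory=list)    # assumed 0-5 scale
        difficulty_ratings: list[int] = field(default_factory=list) # assumed 0-5 scale

        def record_answer(self, choice: int) -> bool:
            """A peer answers; return whether they matched the author's answer."""
            self.answers.append(choice)
            return choice == self.correct_index

        def agreement(self) -> float:
            """Fraction of peers who agree with the author's intended answer."""
            if not self.answers:
                return 0.0
            return sum(a == self.correct_index for a in self.answers) / len(self.answers)

        def mean_quality(self) -> float:
            """Average peer quality rating (0.0 if nobody has rated yet)."""
            return mean(self.quality_ratings) if self.quality_ratings else 0.0

    # A student contributes a question with a justification ...
    q = Question(
        stem="Which sorting algorithm is O(n log n) in the worst case?",
        alternatives=["Quicksort", "Mergesort", "Bubble sort"],
        correct_index=1,
        explanation="Mergesort always splits and merges in O(n log n); "
                    "quicksort degrades to O(n^2) on adversarial input.",
    )
    # ... and peers answer it and rate it.
    q.record_answer(1)
    q.quality_ratings.append(4)
    q.difficulty_ratings.append(2)
    print(f"agreement={q.agreement():.2f}, quality={q.mean_quality():.1f}")
    ```

    The `agreement` measure is one simple way peer answers can surface questions whose "correct" answer is disputed, which is the kind of quality judgement the review says students post.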


    Published In

    Koli '08: Proceedings of the 8th International Conference on Computing Education Research
    November 2008
    127 pages
    ISBN: 9781605583853
    DOI: 10.1145/1595356

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. MCQ
    2. PeerWise
    3. automated
    4. contributing student
    5. peer assessment
    6. question test bank

    Qualifiers

    • Research-article

    Conference

    Koli '08

    Acceptance Rates

    Overall acceptance rate: 80 of 182 submissions (44%)


    Cited By

    • (2024) Multimodal prediction of student performance: A fusion of signed graph neural networks and large language models. Pattern Recognition Letters, 181:1-8, May 2024. DOI: 10.1016/j.patrec.2024.03.007
    • (2022) Impact of using student generated multiple choice questions in learning physiology. International Journal of Scientific Research, pp. 3-5, Jul. 2022. DOI: 10.36106/ijsr/6000207
    • (2021) Sustainable Approaches for Accelerated Learning. Sustainability, 13(21):11994, Oct. 2021. DOI: 10.3390/su132111994
    • (2020) Quality and feature of multiple-choice questions in education. Problems of Education in the 21st Century, 78(4):576-594, Aug. 2020. DOI: 10.33225/pec/20.78.576
    • (2020) Ten simple rules for supporting a temporary online pivot in higher education. PLOS Computational Biology, 16(10):e1008242, Oct. 2020. DOI: 10.1371/journal.pcbi.1008242
    • (2020) Asynchronous Assistance: A Social Network Analysis of Influencing Peer Interactions in PeerWise. European Journal of Mathematics and Science Education, 1(1):43-52, Jun. 2020. DOI: 10.12973/ejmse.1.1.43
    • (2020) Integrating supercomputing clusters into education: a case study in biotechnology. The Journal of Supercomputing, Jun. 2020. DOI: 10.1007/s11227-020-03360-5
    • (2020) Modelling Learners in Crowdsourcing Educational Systems. Artificial Intelligence in Education, pp. 3-9, Jun. 2020. DOI: 10.1007/978-3-030-52240-7_1
    • (2019) The Relationship Between Voluntary Practice of Short Programming Exercises and Exam Performance. Proceedings of the ACM Conference on Global Computing Education, pp. 113-119, May 2019. DOI: 10.1145/3300115.3309525
    • (2018) Empirical Support for a Causal Relationship Between Gamification and Learning Outcomes. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-13, Apr. 2018. DOI: 10.1145/3173574.3173885
