ABSTRACT
Recent research has focused on tools that support the creation, review, and sharing of student-generated content for peer learning. However, we know little about students' perspectives on such activities. In this paper, we identify what students believe is most helpful for their learning by analysing open-ended comments from students engaged in creating, answering, and reviewing exam-style questions generated by their peers. Students report learning about content and appropriate standards of work, both individually and through interaction with peer-generated resources.
Activities, affordances and attitude: how student-generated questions assist learning