ABSTRACT
An interactive learning task was designed in a game format to help high school students acquire knowledge about a simple mechanical system involving a car moving on a ramp. This ramp game consisted of five challenges that addressed individual knowledge components with increasing difficulty. To investigate patterns of knowledge emergence during the ramp game, we applied the Monte Carlo Bayesian Knowledge Tracing (BKT) algorithm to 447 game segments produced by 64 student groups in two physics teachers' classrooms. Results indicate that, in the ramp game context, (1) the initial knowledge and guessing parameters were significantly correlated, (2) the slip parameter admitted a monotonic interpretation, (3) low guessing parameter values were associated with knowledge emergence while high guessing parameter values were associated with knowledge maintenance, and (4) the transition parameter reflected the speed of knowledge emergence. By applying k-means clustering to ramp game segments represented in the three-dimensional space defined by the guessing, slip, and transition parameters, we identified seven clusters of knowledge emergence. We characterize these clusters and discuss implications for future research as well as for instructional game design.
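The standard BKT update underlying the analysis can be sketched as follows. This is a minimal illustration of the two-step update (Bayesian posterior on knowledge given a response, then the learning transition); the parameter values below are hypothetical placeholders, not the fitted values from the study, and the Monte Carlo fitting procedure itself is not shown.

```python
def bkt_update(p_known, correct, guess, slip, transit):
    """One BKT step: condition P(known) on the observed response,
    then apply the learning-transition probability."""
    if correct:
        # Posterior that the skill was known given a correct response
        cond = (p_known * (1 - slip)) / (
            p_known * (1 - slip) + (1 - p_known) * guess)
    else:
        # Posterior that the skill was known given an incorrect response
        cond = (p_known * slip) / (
            p_known * slip + (1 - p_known) * (1 - guess))
    # Chance of learning the skill between opportunities
    return cond + (1 - cond) * transit

def trace(observations, p_init=0.2, guess=0.2, slip=0.1, transit=0.3):
    """Trace P(known) across a sequence of correct/incorrect responses
    (placeholder parameter values for illustration only)."""
    p = p_init
    history = []
    for obs in observations:
        p = bkt_update(p, obs, guess, slip, transit)
        history.append(p)
    return history
```

With these illustrative parameters, a run of correct responses drives the knowledge estimate upward, which is the kind of trajectory the clustering step would then summarize in (guess, slip, transition) space.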
How does Bayesian knowledge tracing model emergence of knowledge about a mechanical system?