DOI: 10.1145/2723576.2723625
LAK Conference Proceedings · Short paper

Combining observational and experiential data to inform the redesign of learning activities

Published: 16 March 2015

Abstract

A main goal of learning analytics is to inform the design of a learning experience in order to improve its quality. The increasing presence of solutions based on big data has even called the validity of current scientific methods into question. Will the same happen in the area of learning analytics? In this paper we postulate that if changes are driven solely by a digital footprint, there is a risk of focusing only on factors that are directly amenable to numeric methods. However, if the changes are complemented with an understanding of how students approach their learning, the quality of the evidence used in the redesign increases significantly. This reasoning is illustrated with a case study in which an initial set of activities for a first-year engineering course was shaped based only on the students' digital footprint. These activities were significantly modified after collecting qualitative data about the students' approach to learning. We conclude by arguing that the interpretation of learning analytics improves when it is combined with qualitative data revealing how and why students engaged with the learning tasks in qualitatively different ways; together, these sources provide a more informed basis for designing learning activities.



Published In

    LAK '15: Proceedings of the Fifth International Conference on Learning Analytics And Knowledge
    March 2015
    448 pages
    ISBN:9781450334174
    DOI:10.1145/2723576
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. active learning
    2. approaches to learning
    3. interventions
    4. learning analytics
    5. mixed methods analysis


Acceptance Rates

LAK '15 Paper Acceptance Rate: 20 of 74 submissions (27%)
Overall Acceptance Rate: 236 of 782 submissions (30%)

Cited By

    • (2023) Using Written Responses to Reflection Questions to Improve Online Student Retention. Supporting Self-Regulated Learning and Student Success in Online Courses, 10.4018/978-1-6684-6500-4.ch013, pp. 282-303. Online publication date: 24-Feb-2023.
    • (2023) Embracing Trustworthiness and Authenticity in the Validation of Learning Analytics Systems. LAK23: 13th International Learning Analytics and Knowledge Conference, 10.1145/3576050.3576060, pp. 552-558. Online publication date: 13-Mar-2023.
    • (2022) Do Student-Written Responses to Reflection Questions Predict Persistence and Performance in Online Courses? Research Anthology on Remote Teaching and Learning and the Future of Online Education, 10.4018/978-1-6684-7540-9.ch119, pp. 2401-2421. Online publication date: 2-Sep-2022.
    • (2021) Students' Emotional Reactions to Social Comparison via a Learner Dashboard. Visualizations and Dashboards for Learning Analytics, 10.1007/978-3-030-81222-5_11, pp. 233-249. Online publication date: 17-Dec-2021.
    • (2020) Do Student-Written Responses to Reflection Questions Predict Persistence and Performance in Online Courses? Early Warning Systems and Targeted Interventions for Student Success in Online Courses, 10.4018/978-1-7998-5074-8.ch001, pp. 1-21. Online publication date: 2020.
    • (2020) Digital footprints (2005–2019): a systematic mapping of studies in education. Interactive Learning Environments, 10.1080/10494820.2020.1814821, 31(2), pp. 876-889. Online publication date: 17-Sep-2020.
    • (2020) An Interactive Virtual E-Learning Framework Using Crowdsourced Analytics. Artificial Intelligence Techniques for Advanced Computing Applications, 10.1007/978-981-15-5329-5_12, pp. 127-136. Online publication date: 24-Jul-2020.
    • (2019) A Conversation between Learning Design and Classroom Observations: A Systematic Literature Review. Education Sciences, 10.3390/educsci9020091, 9(2), 91. Online publication date: 26-Apr-2019.
    • (2018) Instructors' Perceptions of Networked Learning and Analytics. Canadian Journal of Learning and Technology, 10.21432/cjlt27644, 44(3). Online publication date: 31-Dec-2018.
    • (2018) Education, Technology and Design: A Much Needed Interdisciplinary Collaboration. Designing for the User Experience in Learning Systems, 10.1007/978-3-319-94794-5_2, pp. 17-39. Online publication date: 26-Sep-2018.
