
Systematic evaluation of e-learning systems: an experimental validation

Published: 14 October 2006 Publication History

Abstract

The evaluation of e-learning applications deserves special attention, and evaluators need effective methodologies and appropriate guidelines to perform their task. We have proposed a methodology called eLSE (e-Learning Systematic Evaluation), which combines a specific inspection technique with user testing. The inspection aims to enable inspectors who may lack wide experience in evaluating e-learning systems to perform accurate evaluations. It is based on evaluation patterns, called Abstract Tasks (ATs), which precisely describe the activities to be performed during inspection; for this reason, it is called AT inspection. In this paper, we present an empirical validation of the AT inspection technique: three groups of novice inspectors evaluated a commercial e-learning system by applying AT inspection, heuristic inspection, or user testing. Results show an advantage of the AT inspection over the other two usability evaluation methods, demonstrating that Abstract Tasks are effective and efficient tools for guiding evaluators and improving their performance. Important methodological considerations on the reliability of usability evaluation techniques are also discussed.




Published In

NordiCHI '06: Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles
October 2006, 517 pages
ISBN: 1595933255
DOI: 10.1145/1182475

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. controlled experiment
    2. e-learning system evaluation
    3. usability evaluation techniques


Acceptance Rates

Overall acceptance rate: 379 of 1,572 submissions (24%)


Cited By

• (2024) From human-centered to symbiotic artificial intelligence: a focus on medical applications. Multimedia Tools and Applications. DOI: 10.1007/s11042-024-20414-5. Online publication date: 28-Nov-2024.
• (2019) Examining user experience of eLearning systems implemented in two universities in Tanzania. Interactive Technology and Smart Education, 17(1), 39-55. DOI: 10.1108/ITSE-05-2019-0025. Online publication date: 27-Sep-2019.
• (2017) A Conceptual Framework for Learning Systems Evaluation. 2017 5th International Conference in Software Engineering Research and Innovation (CONISOFT), 171-178. DOI: 10.1109/CONISOFT.2017.00028. Online publication date: Oct-2017.
• (2016) A System for English Vocabulary Acquisition based on Code-Switching. International Journal of Distance Education Technologies, 14(3), 52-75. DOI: 10.4018/IJDET.2016070104. Online publication date: 1-Jul-2016.
• (2014) Barefoot usability evaluations. Behaviour & Information Technology, 33(11), 1148-1167. DOI: 10.1080/0144929X.2014.883552. Online publication date: 1-Nov-2014.
• (2014) Can Evaluation Patterns Enable End Users to Evaluate the Quality of an e-learning System? An Exploratory Study. Universal Access in Human-Computer Interaction. Universal Access to Information and Knowledge, 185-196. DOI: 10.1007/978-3-319-07440-5_18. Online publication date: 2014.
• (2012) Training software development practitioners in usability testing. Proceedings of the 24th Australian Computer-Human Interaction Conference, 52-60. DOI: 10.1145/2414536.2414545. Online publication date: 26-Nov-2012.
• (2011) The effect of system usability and multitasking activities in distance learning. Proceedings of the 9th ACM SIGCHI Italian Chapter International Conference on Computer-Human Interaction: Facing Complexity, 59-64. DOI: 10.1145/2037296.2037314. Online publication date: 13-Sep-2011.
• (2011) Do patterns help novice evaluators? A comparative study. International Journal of Human-Computer Studies, 69(1-2), 52-69. DOI: 10.1016/j.ijhcs.2010.07.005. Online publication date: 1-Jan-2011.
• (2010) Training software developers in usability engineering. Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, 82-91. DOI: 10.1145/1868914.1868928. Online publication date: 16-Oct-2010.
