DOI: 10.1145/1054972.1054993
Article

Use of eye movements as feedforward training for a synthetic aircraft inspection task

Published: 02 April 2005

Abstract

Aircraft inspection is a vital element in assuring the safety and reliability of the air transportation system. The human inspector performing visual inspection of an aircraft is the backbone of this process, and training is an effective strategy for improving inspection performance. Previous studies have shown offline feedback training to be effective in improving subsequent visual inspection performance. Because experienced inspectors are known to adopt a better inspection strategy than novices, providing a visualization of experts' cognitive processes a priori can accelerate novices' adoption of the experts' strategy. Using eye tracking equipment, we record the point of regard of an expert inspector performing an inspection task in a virtual reality simulator. Analysis of the expert's eye movements yields a visualization of their scanpaths and allows us to display the inspector's visual search (hence cognitive) strategy. We show how providing this type of scanpath-based feedforward training to novices leads to improved accuracy in the simulator, coupled with an observed speed-accuracy tradeoff. We contend that the tradeoff results from trained novices adopting a slower-paced strategy through increased fixation durations, suggesting that trained novices learn a more deliberate target search/discrimination strategy that requires more time to execute.
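
To make the eye movement analysis step concrete, here is a minimal sketch of one standard way to reduce raw gaze samples to fixations and a scanpath: a dispersion-threshold (I-DT) classifier. The sample format, threshold values, and function names below are illustrative assumptions, not the processing pipeline reported in the paper.

from dataclasses import dataclass
from typing import List, Tuple

# A gaze sample: (timestamp in seconds, x, y) in image/screen coordinates (assumed format).
Sample = Tuple[float, float, float]

@dataclass
class Fixation:
    x: float          # centroid x of the fixation
    y: float          # centroid y of the fixation
    start: float      # onset time (s)
    duration: float   # fixation duration (s)

def detect_fixations(samples: List[Sample],
                     dispersion_threshold: float = 25.0,  # assumed value, in pixels
                     min_duration: float = 0.100) -> List[Fixation]:
    """Classic dispersion-threshold (I-DT) fixation detection: group consecutive
    samples whose spatial dispersion stays below a threshold for at least
    min_duration seconds."""
    fixations: List[Fixation] = []
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window spanning at least min_duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        xs = [s[1] for s in samples[i:j + 1]]
        ys = [s[2] for s in samples[i:j + 1]]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= dispersion_threshold:
            # Extend the window while dispersion stays under the threshold.
            while j + 1 < n:
                xs.append(samples[j + 1][1])
                ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_threshold:
                    xs.pop(); ys.pop()
                    break
                j += 1
            start = samples[i][0]
            fixations.append(Fixation(sum(xs) / len(xs), sum(ys) / len(ys),
                                      start, samples[j][0] - start))
            i = j + 1
        else:
            i += 1
    return fixations

def scanpath(fixations: List[Fixation]) -> List[Tuple[float, float]]:
    """A scanpath is the ordered sequence of fixation centroids; drawing it over
    the inspected scene is one way to present an expert's search strategy."""
    return [(f.x, f.y) for f in fixations]

Under this kind of reduction, the increased fixation durations discussed in the abstract would appear directly in the duration field of the detected fixations, while the scanpath gives the ordered search sequence that can be replayed to trainees.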





      Published In

      CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
      April 2005
      928 pages
      ISBN:1581139985
      DOI:10.1145/1054972

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 02 April 2005


      Author Tags

      1. eye tracking
      2. virtual reality
      3. visual search

      Qualifiers

      • Article

      Conference

CHI '05

      Acceptance Rates

CHI '05 paper acceptance rate: 93 of 372 submissions (25%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)



      Cited By

• (2024) Integration of eye-tracking and object detection in a deep learning system for quality inspection analysis. Journal of Computational Design and Engineering 11(3), 158-173. https://doi.org/10.1093/jcde/qwae042
• (2024) Attention computing for enhanced visuomotor skill performance: Testing the effectiveness of gaze-adaptive cues in virtual reality golf putting. Multimedia Tools and Applications 83(21), 60861-60879. https://doi.org/10.1007/s11042-023-17973-4
• (2024) Gaze cueing improves pattern recognition of histology learners. Anatomical Sciences Education 17(7), 1461-1472. https://doi.org/10.1002/ase.2498
• (2023) Cueing Sequential 6DoF Rigid-Body Transformations in Augmented Reality. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 356-365. https://doi.org/10.1109/ISMAR59233.2023.00050
• (2023) Virtual Reality System using Explainable AI for Identification of Specific Expert Refinery Inspection Skills. 2023 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 1214-1219. https://doi.org/10.1109/AIM46323.2023.10196157
• (2022) Attention guidance and cognitive processing: The instructional role of eye movement modeling examples. Advances in Psychological Science 26(8), 1404-1416. https://doi.org/10.3724/SP.J.1042.2018.01404
• (2022) Eye-Tracking in Immersive Virtual Reality for Education: A Review of the Current Progress and Applications. Frontiers in Education 7. https://doi.org/10.3389/feduc.2022.697032
• (2022) Carousel: Improving the Accuracy of Virtual Reality Assessments for Inspection Training Tasks. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-10. https://doi.org/10.1145/3562939.3565618
• (2022) Precueing Object Placement and Orientation for Manual Tasks in Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 28(11), 3799-3809. https://doi.org/10.1109/TVCG.2022.3203111
• (2022) Evaluation of expert skills in refinery patrol inspection: visual attention and head positioning behavior. Heliyon, e12117. https://doi.org/10.1016/j.heliyon.2022.e12117
