Interactive Displays in Vehicles: Improving Usability with a Pointing Gesture Tracker and Bayesian Intent Predictors

Published: 17 September 2014
DOI: 10.1145/2667317.2667413

ABSTRACT

Interactive displays are becoming an integral part of the modern vehicle environment. Using them typically requires dedicating considerable attention and undertaking a pointing gesture to select an interface item/icon displayed on a touchscreen, which can have serious safety implications for the driver. The pointing gesture can also be highly perturbed by road and driving conditions, resulting in erroneous selections. In this paper, we propose a probabilistic intent prediction approach that establishes the targeted icon on the interface early in the pointing gesture. It employs a 3D vision sensor to continuously track the pointing hand/finger, in conjunction with suitable Bayesian prediction algorithms. The introduced technique can significantly reduce the pointing task completion time and the associated visual, cognitive and movement effort, as well as enhance selection accuracy. These substantial gains, and the characteristics of in-vehicle pointing gestures, are demonstrated using data collected in an instrumented vehicle.
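
The paper itself contains no code; the following is a minimal Python sketch of the kind of Bayesian intent inference the abstract describes: a posterior over candidate on-screen icons is updated from a partial fingertip track supplied by a 3D tracker. The straight-line "noisy reach" likelihood, the Gaussian deviation parameter sigma, and all coordinates, names and numbers are illustrative assumptions, not the authors' model or the specific predictors evaluated in the paper.

# Minimal, illustrative Bayesian intent predictor (not the authors' algorithm).
# Assumptions: fingertip positions are streamed from a 3D gesture tracker in the
# same coordinate frame as the display, each candidate icon is a known 3D point,
# and a reach towards icon i stays close to the straight line from the gesture
# start to that icon, with Gaussian perpendicular deviation of std. dev. sigma.
import numpy as np

def intent_posterior(track, icons, sigma=0.02, prior=None):
    """Posterior probability of each candidate icon given a partial track.

    track : (T, 3) array of observed fingertip positions so far (metres).
    icons : (N, 3) array of icon centres on the display (metres).
    sigma : assumed std. dev. of perpendicular deviation from a straight reach.
    prior : optional (N,) prior over icons; uniform if omitted.
    """
    track = np.asarray(track, dtype=float)
    icons = np.asarray(icons, dtype=float)
    start = track[0]
    if prior is None:
        prior = np.full(len(icons), 1.0 / len(icons))
    log_post = np.log(np.asarray(prior, dtype=float))

    for i, icon in enumerate(icons):
        direction = icon - start
        direction /= np.linalg.norm(direction)
        for point in track[1:]:
            # Perpendicular distance of the observed point from the straight
            # line joining the start of the gesture to this icon.
            offset = point - start
            perp = offset - np.dot(offset, direction) * direction
            # Gaussian log-likelihood of that deviation, accumulated per icon.
            log_post[i] += -0.5 * (np.linalg.norm(perp) / sigma) ** 2

    log_post -= log_post.max()          # subtract max for numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Hypothetical example: three icons on the display and a short partial track
# heading roughly towards the middle one.
icons = np.array([[-0.10, 0.0, 0.40], [0.00, 0.0, 0.40], [0.10, 0.0, 0.40]])
track = np.array([[0.00, -0.30, 0.10], [0.00, -0.22, 0.18], [0.01, -0.15, 0.25]])
print(intent_posterior(track, icons))   # highest probability on the middle icon

In a deployed interface, the predicted icon could, for example, be highlighted, expanded or selected once its posterior exceeds a confidence threshold, which is one way early prediction can shorten the pointing task and reduce the attention it demands.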

Published in

AutomotiveUI '14: Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
September 2014, 287 pages
ISBN: 9781450332125
DOI: 10.1145/2667317
Copyright © 2014 ACM

Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

AutomotiveUI '14 Paper Acceptance Rate: 36 of 79 submissions, 46%
Overall Acceptance Rate: 248 of 566 submissions, 44%
