DOI: 10.1145/1647314.1647383

Poster

Augmented reality target finding based on tactile cues

Published: 02 November 2009

Abstract

This study is based on a user scenario in which augmented reality targets can be found by scanning the environment with a mobile device and receiving tactile feedback exactly in the direction of a target. To understand how accurately and quickly such targets can be found, we prepared an experimental setup using a sensor-actuator device consisting of orientation-tracking hardware and a tactile actuator. Targets with widths of 5°, 10°, 15°, 20°, and 25°, separated by various distances, were rendered successively in a 90°-wide space, and the participants' task was to find them as quickly as possible. The experiment had two conditions: the first provided tactile feedback only when the device pointed at the target, while the second added a further cue indicating the proximity of the target. The average target-finding time was 1.8 seconds. The closest targets turned out not to be the easiest to find, which we attribute to participants' adapted scanning velocity causing them to miss the closest targets. We also found that our data did not correlate well with Fitts' model, which may have been caused by the non-normal data distribution; after filtering out the 30% least representative data items, the correlation reached 0.71. Overall, performance did not differ significantly between the conditions. The only significant improvement offered by the close-to-target cue occurred in tasks where the targets were furthest from each other.
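The abstract compares observed target-finding times against Fitts' model. A minimal sketch of the index of difficulty that model relies on (Shannon formulation, ID = log2(A/W + 1)), using the study's target widths; the 45° movement amplitude is an illustrative assumption, not a value reported in the paper:

```python
import math

def index_of_difficulty(amplitude_deg, width_deg):
    """Shannon formulation of Fitts' index of difficulty: ID = log2(A/W + 1)."""
    return math.log2(amplitude_deg / width_deg + 1)

# Hypothetical example: the study's target widths (5 to 25 degrees) with an
# assumed amplitude of 45 degrees (the paper used various inter-target
# distances within a 90-degree-wide space; 45 is illustrative only).
ids = {w: round(index_of_difficulty(45, w), 2) for w in (5, 10, 15, 20, 25)}
print(ids)  # narrower targets yield a higher ID, i.e. are predicted to be harder
```

Fitts' model then predicts movement time as MT = a + b * ID for empirically fitted constants a and b; the study found this linear relation fit the data poorly until the least representative 30% of trials were filtered out.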




Published In

ICMI-MLMI '09: Proceedings of the 2009 international conference on Multimodal interfaces
November 2009
374 pages
ISBN:9781605587721
DOI:10.1145/1647314

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Fitts' Law
  2. augmented reality
  3. haptics
  4. pointing

Conference

ICMI-MLMI '09

Acceptance Rates

Overall acceptance rate: 453 of 1,080 submissions (42%)

