DOI: 10.1145/1226969.1227023

Tap input as an embedded interaction method for mobile devices

Published: 15 February 2007

Abstract

In this paper we describe a novel method for interacting with mobile devices without the need to access a keypad or a display. A tap with the hand can be reliably detected, e.g. through a pocket, by means of an acceleration sensor. By carefully designing the user interface, the tap can be used to activate logically similar functionalities on the device, leading to a simple but useful interaction method. We present results of user tests aimed at studying the usability of various tap-input-based user interface applications.
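The detection principle described above — a tap sensed through fabric with an acceleration sensor — can be illustrated with a short sketch. The paper does not publish its detection algorithm; the following is a minimal threshold detector, and all names and parameter values (sample rate, spike threshold, refractory period) are assumptions for the example, not the authors' implementation.

```python
# Illustrative sketch of accelerometer-based tap detection: a tap shows up
# as a short, sharp spike in acceleration magnitude. Threshold and
# refractory period below are assumed values, not the paper's.

def detect_taps(samples, rate_hz=100, threshold=2.0, refractory_s=0.2):
    """Return sample indices at which a tap is detected.

    samples: iterable of (x, y, z) acceleration in units of g.
    threshold: spike height above the 1 g gravity baseline, in g.
    refractory_s: minimum time between two reported taps, to avoid
        counting the ringing of one physical tap as several taps.
    """
    refractory = int(refractory_s * rate_hz)
    taps, last_tap = [], -refractory
    for i, (x, y, z) in enumerate(samples):
        magnitude = (x * x + y * y + z * z) ** 0.5
        # Subtract the static 1 g gravity component; a resting device
        # then reads ~0, while a tap produces a brief large excursion.
        if abs(magnitude - 1.0) > threshold and i - last_tap >= refractory:
            taps.append(i)
            last_tap = i
    return taps

# Example: a device at rest with one simulated tap at sample 50.
signal = [(0.0, 0.0, 1.0)] * 100
signal[50] = (0.0, 0.0, 4.5)   # sharp spike from a tap
print(detect_taps(signal))      # [50]
```

A real implementation would also band-pass filter the signal and tune the threshold per device, but this captures why a tap remains detectable through a pocket: the transient is large relative to ordinary carrying motion.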




    Published In

    TEI '07: Proceedings of the 1st international conference on Tangible and embedded interaction
    February 2007
    296 pages
    ISBN:9781595936196
    DOI:10.1145/1226969
    • Conference Chairs: Brygg Ullmer, Albrecht Schmidt

    Sponsors

    • CCT: LSU Center for Computation and Technology


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gesture input
    2. haptic user interfaces
    3. mobile devices
    4. user studies


    Conference

    TEI07: Tangible and Embedded Interaction 2007
    February 15–17, 2007
    Baton Rouge, Louisiana
    Sponsor: CCT

    Acceptance Rates

    Overall Acceptance Rate 393 of 1,367 submissions, 29%


    Cited By

    • (2024) Make Interaction Situated: Designing User Acceptable Interaction for Situated Visualization in Public Environments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–21. DOI: 10.1145/3613904.3642049
    • (2024) A visual programming tool for mobile web augmentation. Knowledge and Information Systems 66(9), 5631–5668. DOI: 10.1007/s10115-023-02039-6
    • (2023) Surveying the Social Comfort of Body, Device, and Environment-Based Augmented Reality Interactions in Confined Passenger Spaces Using Mixed Reality Composite Videos. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1–25. DOI: 10.1145/3610923
    • (2021) Exploring Social Acceptability and Users' Preferences of Head- and Eye-Based Interaction with Mobile Devices. Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, 12–23. DOI: 10.1145/3490632.3490636
    • (2021) PocketView: Through-Fabric Information Displays. The 34th Annual ACM Symposium on User Interface Software and Technology, 511–523. DOI: 10.1145/3472749.3474766
    • (2021) EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices. Proceedings of the 2021 International Conference on Multimodal Interaction, 577–585. DOI: 10.1145/3462244.3479938
    • (2021) Probing User Perceptions of On-Skin Notification Displays. Proceedings of the ACM on Human-Computer Interaction 4(CSCW3), 1–20. DOI: 10.1145/3432943
    • (2021) Project Tasca: Enabling Touch and Contextual Interactions with a Pocket-based Textile Sensor. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–13. DOI: 10.1145/3411764.3445712
    • (2021) FaceSight: Enabling Hand-to-Face Gesture Interaction on AR Glasses with a Downward-Facing Camera Vision. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–14. DOI: 10.1145/3411764.3445484
    • (2021) Acceptability of Speech and Silent Speech Input Methods in Private and Public. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–13. DOI: 10.1145/3411764.3445430
