DOI: 10.5555/1555880.1555910
Research article · Free access

Determining the benefits of direct-touch, bimanual, and multifinger input on a multitouch workstation

Published: 25 May 2009

Abstract

Multitouch workstations support direct-touch, bimanual, and multifinger interaction. Previous studies have separately examined the benefits of these three interaction attributes over mouse-based interaction. In contrast, we present an empirical user study that considers all three attributes together in a single task, so that we can quantify and compare the performance of each attribute. In our experiment users select multiple targets using either a mouse-based workstation equipped with one mouse, or a multitouch workstation using one finger, two fingers (one from each hand), or multiple fingers. We find that the fastest multitouch condition is about twice as fast as the mouse-based workstation, independent of the number of targets. Direct-touch with one finger accounts for an average of 83% of the reduction in selection time. Bimanual interaction, using at least two fingers, one on each hand, accounts for the remaining reduction in selection time. Further, we find that for novice multitouch users there is no significant difference in selection time between using one finger on each hand and using any number of fingers for this task. Based on these observations we conclude with several design guidelines for developing multitouch user interfaces.




Information

    Published In

    GI '09: Proceedings of Graphics Interface 2009
    May 2009
    257 pages
ISBN: 9781568814704

    Sponsors

    • The Canadian Human-Computer Communications Society / Société Canadienne du Dialogue Humaine Machine (CHCCS/SCDHM)

    Publisher

    Canadian Information Processing Society

    Canada

    Publication History

    Published: 25 May 2009

    Author Tags

    1. bimanual input
    2. direct-touch input
    3. mouse
    4. multifinger input
    5. multitarget selection
    6. multitouch

    Qualifiers

    • Research-article

    Acceptance Rates

GI '09 paper acceptance rate: 28 of 77 submissions (36%)
Overall acceptance rate: 206 of 508 submissions (41%)


    Cited By

• (2024) Get Your Hands Dirty? A Comparative Study of Tool Usage and Perceptual Engagement in Physical and Digital Sculpting. Proceedings of the 16th Conference on Creativity & Cognition, pp. 358–373. DOI: 10.1145/3635636.3656188. Online publication date: 23 Jun 2024.
• (2021) Bi-3D: Bi-Manual Pen-and-Touch Interaction for 3D Manipulation on Tablets. The 34th Annual ACM Symposium on User Interface Software and Technology, pp. 149–161. DOI: 10.1145/3472749.3474741. Online publication date: 10 Oct 2021.
• (2020) Understanding Gesture and Speech Multimodal Interactions for Manipulation Tasks in Augmented Reality Using Unconstrained Elicitation. Proceedings of the ACM on Human-Computer Interaction, 4(ISS), pp. 1–21. DOI: 10.1145/3427330. Online publication date: 4 Nov 2020.
• (2019) An experimental comparison of touch and pen gestures on a vertical display. Proceedings of the 8th ACM International Symposium on Pervasive Displays, pp. 1–6. DOI: 10.1145/3321335.3324936. Online publication date: 12 Jun 2019.
• (2018) A multi-camera image processing and visualization system for train safety assessment. Multimedia Tools and Applications, 77(2), pp. 1583–1604. DOI: 10.1007/s11042-017-4351-4. Online publication date: 1 Jan 2018.
• (2017) Mouse, Tactile, and Tangible Input for 3D Manipulation. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 4727–4740. DOI: 10.1145/3025453.3025863. Online publication date: 2 May 2017.
• (2016) Performing universal tasks using a mini iPad. Proceedings of the XVII International Conference on Human Computer Interaction, pp. 1–6. DOI: 10.1145/2998626.2998668. Online publication date: 13 Sep 2016.
• (2016) Three-Point Interaction. Proceedings of the International Working Conference on Advanced Visual Interfaces, pp. 168–175. DOI: 10.1145/2909132.2909251. Online publication date: 7 Jun 2016.
• (2016) Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 2845–2856. DOI: 10.1145/2858036.2858201. Online publication date: 7 May 2016.
• (2016) Biomechanics of Thumb Touch Gestures on Handheld Devices. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 3227–3233. DOI: 10.1145/2851581.2892294. Online publication date: 7 May 2016.
• (Additional citing publications not shown.)
