
Exploring the use of tangible user interfaces for human-robot interaction: a comparative study

Published: 06 April 2008

Abstract

In this paper we suggest the use of tangible user interfaces (TUIs) for human-robot interaction (HRI) applications. We discuss the potential benefits of this approach while focusing on tasks with a low level of autonomy. We present an experimental robotic interaction test bed to support our investigation. We use the test bed to explore two HRI-related task-sets: robotic navigation control and robotic posture control. We discuss the implementation of these two task-sets using an AIBO robot dog. Both task-sets were mapped to two different robotic control interfaces: a keypad interface, which resembles the interaction approach currently common in HRI, and a gesture input mechanism based on Nintendo Wii game controllers. We discuss the implementation of the interfaces and conclude with a detailed user study evaluating these HRI techniques across the two robotic task-sets.
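The gesture interface described above maps Wii Remote accelerometer readings onto robot commands. As a rough illustration of that kind of tilt-to-command mapping (a minimal sketch only, not the authors' implementation; the function name, axis conventions, and threshold below are assumptions), consider:

```python
# Hypothetical sketch: map a normalized 3-axis acceleration sample (in g)
# from a Wii-Remote-style controller to a discrete navigation command.
# Axis conventions and the 0.35 g threshold are illustrative assumptions.

def tilt_to_command(ax, ay, az, threshold=0.35):
    """Return a navigation command based on controller tilt."""
    if ay > threshold:
        return "FORWARD"
    if ay < -threshold:
        return "BACKWARD"
    if ax > threshold:
        return "TURN_RIGHT"
    if ax < -threshold:
        return "TURN_LEFT"
    return "STOP"

# Example: a slight forward tilt of the controller
print(tilt_to_command(0.05, 0.6, 0.8))  # -> "FORWARD"
```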



    Published In

    CHI '08: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2008, 1870 pages
ISBN: 9781605580111
DOI: 10.1145/1357054

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gesture input
    2. human-robot interaction
    3. tangible user interface

    Qualifiers

    • Research-article

    Conference

    CHI '08

    Acceptance Rates

    CHI '08 Paper Acceptance Rate 157 of 714 submissions, 22%;
    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%
