Research article
DOI: 10.1145/1622176.1622197

Disappearing mobile devices

Published: 04 October 2009

Abstract

In this paper, we extrapolate the evolution of mobile devices in one specific direction, namely miniaturization. While we maintain the concept of a device that people are aware of and interact with intentionally, we envision that this concept can become small enough to allow invisible integration into arbitrary surfaces or human skin, and thus truly ubiquitous use. This outcome assumed, we investigate what technology would be most likely to provide the basis for these devices, what abilities such devices can be expected to have, and whether or not devices that size can still allow for meaningful interaction. We survey candidate technologies, drill down on gesture-based interaction, and demonstrate how it can be adapted to the desired form factors. While the resulting devices offer only the bare minimum in feedback and only the most basic interactions, we demonstrate that simple applications remain possible. We complete our exploration with two studies in which we investigate the affordance of these devices more concretely, namely marking and text entry using a gesture alphabet.
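The marking and gesture-alphabet text entry mentioned in the abstract both reduce to classifying single strokes. As a rough illustration only (a hypothetical sketch, not the recognizer used in the paper), a unistroke can be matched against letter templates by resampling it to a fixed number of points, normalizing its bounding box, and picking the nearest template:

```python
# Hypothetical unistroke matcher (illustrative sketch, not the paper's method):
# resample a stroke to a fixed point count, scale it into a unit box, and
# classify by nearest-neighbor distance to stored templates.
import math


def resample(points, n=32):
    """Resample a polyline to n evenly spaced points along its length."""
    total = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    step = total / (n - 1)
    out = [points[0]]
    pts = list(points)
    acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the interpolated point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(points[-1])
    return out[:n]


def normalize(points):
    """Translate to the origin and scale to a unit bounding box."""
    xs, ys = zip(*points)
    w = max(max(xs) - min(xs), 1e-9)
    h = max(max(ys) - min(ys), 1e-9)
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]


def classify(stroke, templates):
    """Return the label of the template with the smallest mean point distance."""
    probe = normalize(resample(stroke))

    def dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    return min(templates, key=lambda name: dist(probe, templates[name]))
```

For example, with templates for an "L" stroke and a straight vertical "I" stroke, `classify([(0, 0), (0, 5), (5, 5)], templates)` returns `"L"`. Scale normalization is what lets the same alphabet work on a tiny sensing area, which is the regime the paper targets.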


Published In

UIST '09: Proceedings of the 22nd annual ACM symposium on User interface software and technology
October 2009
278 pages
ISBN:9781605587455
DOI:10.1145/1622176
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. gesture
  2. input device
  3. interaction technique
  4. miniaturization
  5. mobile device
  6. sensor
  7. ubicomp
  8. wearable

Conference

UIST '09

Acceptance Rates

Overall Acceptance Rate 561 of 2,567 submissions, 22%

Cited By

  • (2024)Free-Hand Input and Interaction in Virtual Reality Using a Custom Force-Based Digital ThimbleApplied Sciences10.3390/app14231101814:23(11018)Online publication date: 27-Nov-2024
  • (2023)Can You Ear Me?Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies10.1145/36109257:3(1-23)Online publication date: 27-Sep-2023
  • (2022)Shapeshifter: Gesture Typing in Virtual Reality with a Force-based Digital ThimbleExtended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems10.1145/3491101.3519679(1-9)Online publication date: 27-Apr-2022
  • (2021)Flashpen: A High-Fidelity and High-Precision Multi-Surface Pen for Virtual Reality2021 IEEE Virtual Reality and 3D User Interfaces (VR)10.1109/VR50410.2021.00053(306-315)Online publication date: Mar-2021
  • (2020)Keep the Phone in Your PocketProceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies10.1145/33973084:2(1-23)Online publication date: 15-Jun-2020
  • (2020)WristLensProceedings of the Augmented Humans International Conference10.1145/3384657.3384797(1-8)Online publication date: 16-Mar-2020
  • (2019)GehnaProceedings of the 2019 CHI Conference on Human Factors in Computing Systems10.1145/3290605.3300751(1-12)Online publication date: 2-May-2019
  • (2019)An Acceptance Model for the Adoption of Smart Glasses Technology by Healthcare ProfessionalsInternational Business and Emerging Economy Firms10.1007/978-3-030-27285-2_6(163-194)Online publication date: 2-Nov-2019
  • (2018)!FTL, an Articulation-Invariant Stroke Gesture Recognizer with Controllable Position, Scale, and Rotation InvariancesProceedings of the 20th ACM International Conference on Multimodal Interaction10.1145/3242969.3243032(125-134)Online publication date: 2-Oct-2018
  • (2017)When Smart Devices Interact With Pervasive ScreensACM Transactions on Multimedia Computing, Communications, and Applications10.1145/311593313:4(1-23)Online publication date: 12-Aug-2017
