
Token+constraint systems for tangible interaction with digital information

Published: 01 March 2005

Abstract

We identify and present a major interaction approach for tangible user interfaces based upon systems of tokens and constraints. In these interfaces, tokens are discrete physical objects which represent digital information. Constraints are confining regions that are mapped to digital operations. These are frequently embodied as structures that mechanically channel how tokens can be manipulated, often limiting their movement to a single degree of freedom. Placing and manipulating tokens within systems of constraints can be used to invoke and control a variety of computational interpretations. We discuss the properties of the token+constraint approach; consider strengths that distinguish it from other interface approaches; and illustrate the concept with eleven past and recent supporting systems. We present some of the conceptual background supporting these interfaces, and consider them in terms of Bellotti et al.'s [2002] five questions for sensing-based interaction. We believe this discussion supports token+constraint systems as a powerful and promising approach for sensing-based interaction.
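To make the approach concrete, the following is a minimal, illustrative sketch (not drawn from the article or from any of the systems it surveys) of how a token+constraint mapping might be modeled in software. The names used here (Token, SlotConstraint, place, move) and the video-scrubbing example are hypothetical; the sketch simply shows tokens carrying digital referents and a one-degree-of-freedom constraint interpreting their presence and position as digital operations.

```python
# Hypothetical sketch of a token+constraint mapping: tokens stand for
# digital information; a slot-like constraint channels them to a single
# degree of freedom and maps placement and movement to operations.

from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Token:
    """A discrete physical object standing for some digital information."""
    tag_id: str          # e.g., an RFID tag read by the sensing surface
    referent: object     # the digital information the token represents


@dataclass
class SlotConstraint:
    """A mechanical channel (rack/slot) that limits tokens to one degree
    of freedom and interprets their presence and position digitally."""
    name: str
    length: float                                # physical extent of the slot
    on_place: Callable[["Token", float], None]   # invoked when a token enters
    on_move: Callable[["Token", float], None]    # invoked as a token slides
    tokens: Dict[str, float] = field(default_factory=dict)  # tag_id -> position

    def place(self, token: Token, position: float) -> None:
        position = max(0.0, min(position, self.length))      # clamp to the slot
        self.tokens[token.tag_id] = position
        self.on_place(token, position / self.length)

    def move(self, token: Token, position: float) -> None:
        if token.tag_id not in self.tokens:
            return
        position = max(0.0, min(position, self.length))
        self.tokens[token.tag_id] = position
        self.on_move(token, position / self.length)


# Example wiring: a token representing a video clip is dropped into a
# "playback" slot; its normalized position scrubs the playhead.
if __name__ == "__main__":
    clip = Token(tag_id="tag-042", referent={"clip": "interview.mov"})

    playback = SlotConstraint(
        name="playback rack",
        length=20.0,  # slot length in arbitrary physical units
        on_place=lambda t, p: print(f"load {t.referent['clip']}, seek {p:.0%}"),
        on_move=lambda t, p: print(f"scrub {t.referent['clip']} to {p:.0%}"),
    )

    playback.place(clip, 0.0)    # placing the token invokes an operation
    playback.move(clip, 10.0)    # sliding it adjusts a continuous parameter
```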

References

[1]
Ahlberg, C. and Shneiderman, B. 1994. Visual information seeking: Tight coupling of dynamic query filters with starfield displays. In Proceedings of Computer-Human Interaction 1994. 313--317.
[2]
Aish, R. and Noakes, P. 1984. Architecture without numbers---CAAD based on a 3D modelling system. Computer-Aided Design 16, 6 (Nov.), 321--328.
[3]
Anagnostou, G., Dewey, D., and Patera, A. 1989. Geometry-defining processors for engineering design and analysis. The Visual Computer 5, 304--315.
[4]
Anderson, D., Frankel, J., Marks, J., Agarwala, A., Beardsley, P., Hodgins, J., Leigh, D., Ryall, K., Sullivan, E., and Yedidia, J. 2000. Tangible interaction + graphical interpretation: A new approach to 3D modelling. In Computer Graphics Proceedings (SIGGRAPH'00). 393--402.
[5]
Ballagas, R., Ringel, M., Stone, M., and Borchers, J. 2003. iStuff: A physical user interface toolkit for ubiquitous computing environments. In Proceedings of Computer-Human Interaction 2003. 537--544.
[6]
Bell, R. 1979. Board and Table Games from Many Civilizations. Dover Publications, New York, NY.
[7]
Bellotti, V., Back, M., Edwards, K., Grinter, R., Henderson, A., and Lopes, C. 2002. Making sense of sensing systems: Five questions for designers and researchers. In Proceedings of Computer-Human Interaction 2002. 415--422.
[8]
Calvillo-Gámez, E., Leland, N., Shaer, O., and Jacob, R. 2003. The TAC paradigm: Unified conceptual framework to represent tangible user interfaces. In Proceedings of the Latin American Conference on Human-Computer Interaction. 9--15.
[9]
Cohen, J., Withgott, M., and Piernot, P. 1999. Logjam: A tangible multi-person interface for video logging. In Proceedings of Computer-Human Interaction 1999. 128--135.
[10]
Cutkosky, M. and Howe, R. 1990. Human grasp choice and robotic grasp analysis. In Dextrous Robot Hands. Springer-Verlag. 5--31.
[11]
Durham, J. 2002a. Abrasives, trust, and how the abacus got its name. http://bart.cba.nau.edu/~durham-j/newsite/id153.htm {Feb. 2002}.
[12]
Durham, J. 2002b. Personal communication. Feb. 10, 2002.
[13]
Fernandes, L. 2001. The abacus: The art of calculating with beads. http://www.ee.ryerson.ca:8080/~elf/abacus/ {Feb. 2002}.
[14]
Fitzmaurice, G., Ishii, H., and Buxton, W. 1995. Bricks: Laying the foundations for graspable user interfaces. In Proceedings of Computer-Human Interaction 1995. 442--449.
[15]
Fjeld, M., Bichsel, M., and Rauterberg, M. 1998. BUILD-IT: An intuitive design tool based on direct object manipulation. In Gesture and Sign Language in Human-Computer Interaction, vol. 1371, Wachsmuth and Fröhlich, Eds. Springer-Verlag, Berlin, Germany. 297--308.
[16]
Frazer, J. 1995. An Evolutionary Architecture. Architectural Association, London, UK.
[17]
Frazer, J. H., Frazer, J. M., and Frazer, P. A. 1989. Intelligent physical three-dimensional modelling systems. In Computer Graphics '89 Conference Proceedings. Online Publications. 359--370.
[18]
Gellersen, H., Schmidt, A., and Beigl, M. 2002. Multi-sensor context-awareness in mobile devices and smart artifacts. Mobile Netw. Appl. 7, 5, 341--351.
[19]
Gibson, J. 1979. The Ecological Approach to Visual Perception. Erlbaum Associates, New York, NY.
[20]
Guiard, Y. 1987. Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. J. Motor Behav. 19, 4, 486--517.
[21]
Hinckley, K., Pausch, R., Proffitt, D., and Kassell, N. 1998. Two-handed virtual manipulation. ACM Trans. Comput.-Hum. Interact. 5, 3, 260--302.
[22]
Holmquist, L., Redström, J., and Ljungstrand, P. 1999. Token-based access to digital information. In Proceedings of Handheld and Ubiquitous Computing (HUC 99). 234--245.
[23]
Hornecker, E. 2002. Understanding the benefits of graspable interfaces for cooperative use. In Proceedings of Cooperative Systems Design 2002. 71--87.
[24]
Ifrah, G. 2001. The Universal History of Computing: From the Abacus to the Quantum Computer. John Wiley & Sons, New York, NY.
[25]
Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E., Yeung, L., and Kanji, Z. 2002. Augmented urban planning workbench: Overlaying drawings, physical models and digital simulation. In Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR '02). 203--214.
[26]
Ishii, H. and Ullmer, B. 1997. Tangible bits: Towards seamless interfaces between people, bits, and atoms. In Proceedings of Computer-Human Interaction 1997. 234--241.
[27]
Jacob, R., Ishii, H., Pangaro, G., and Patten, J. 2002. A tangible interface for organizing information using a grid. In Proceedings of Computer-Human Interaction 2002. 339--346.
[28]
Jones, W. and Dumais, S. 1986. The spatial metaphor for user interfaces: Experimental tests of reference by location versus name. ACM Trans. Office Inf. Syst. 4, 1 (Jan.), 42--63.
[29]
Kirsh, D. 1995. The intelligent use of space. Artif. Intell. 73, 31--68.
[30]
Klemmer, S. 2003. Papier-Mâché: Toolkit support for tangible interaction. In Proceedings of User Interface Software and Technology 2003.
[31]
Larkin, J. and Simon, H. 1987. Why a diagram is (sometimes) worth ten thousand words. Cognit. Sci. 11, 65--99.
[32]
Lütjens, J. 2002. Abacus online museum. http://www.joernluetjens.de/sammlungen/abakus/abakus-en.htm {Feb. 2002}.
[33]
MacLean, K., Snibbe, S., and Levin, G. 2000. Tagged handles: Merging discrete and continuous manual control. In Proceedings of Computer-Human Interaction 2000. 225--232.
[34]
Masters, J. 2002. The royal game of Ur and Tau. http://www.tradgames.org.uk/games/Royal-Game-Ur.htm {Aug. 2002}.
[35]
Mazalek, A. and Jehan, T. 2000. Interacting with music in a social setting. In Extended Abstracts of Computer-Human Interaction 2000. 255--256.
[36]
McNerney, T. 2000. Tangible programming bricks: An approach to making programming accessible to everyone. M.S. thesis, MIT Media Laboratory.
[37]
Nelson, L., Ichimura, S., Pederson, E., and Adams, L. 1999. Palette: A paper interface for giving presentations. In Proceedings of Computer-Human Interaction 1999. 354--361.
[38]
Neurosmith. 1999. MusicBlocks product. http://www.neurosmith.com/.
[39]
Norman, D. 1999. Affordances, conventions, and design. Interactions 6, 3, 38--43.
[40]
Norman, D. 1993. Things That Make Us Smart. Addison-Wesley, Reading, MA.
[41]
Oxford English Dictionary (OED). 1989. OED Online. Oxford University Press.
[42]
Pangaro, G., Maynes-Aminzade, D., and Ishii, H. 2002. The actuated workbench: Computer-controlled actuation in tabletop tangible interfaces. In Proceedings of User Interface Software and Technology 2002. 181--190.
[43]
Patten, J., Recht, B., and Ishii, H. 2002. AudioPad: A tag-based interface for musical performance. In Proceedings of the International Conference on New Interfaces for Musical Expression.
[44]
Patten, J., Ishii, H., Hines, J., and Pangaro, G. 2001. Sensetable: A wireless object tracking platform for tangible user interfaces. In Proceedings of Computer-Human Interaction 2001. 253--260.
[45]
Perlman, R. 1976. Using computer technology to provide a creative learning environment for preschool children. MIT Logo Memo #24.
[46]
Petre, M. 1995. Why looking isn't always seeing: Readership skills and graphical programming. Commun. ACM 38, 6 (June), 33--44.
[47]
Poynor, R. 1995. The hand that rocks the cradle. I.D. (May/June), 60--65.
[48]
Preece, J., Rogers, Y., and Sharp, H. 2002. Interaction Design. John Wiley and Sons, New York, NY.
[49]
Redström, J. 2001. Designing everyday computational things. Ph.D. thesis, Göteborg University.
[50]
Rekimoto, J., Ullmer, B., and Oba, H. 2001. DataTiles: A modular platform for mixed physical and graphical interactions. In Proceedings of Computer-Human Interaction 2001. 269--276.
[51]
Retz-Schmidt, G. 1988. Various views on spatial prepositions. AI Magazine 9, 2, 95--105.
[52]
Scaife, M. and Rogers, Y. 1996. External cognition: How do graphical representations work? Int. J. Hum.-Comput. Stud. 45, 2, 185--213.
[53]
Schäfer, K., Brauer, V., and Bruns, W. 1997. A new approach to human-computer interaction---synchronous modelling in real and virtual spaces. In Proceedings of Designing Interactive Systems 1997. 335--344.
[54]
Schießl, S. 2002. Digital cubes. http://www.aec.at/festival2002/texte/schieβl_e.asp.
[55]
Schmandt-Besserat, D. 1997. How Writing Came About. University of Texas Press, Austin, TX.
[56]
Shneiderman, B. 1983. Direct manipulation: A step beyond programming languages. IEEE Comput. 16, 8, 57--69.
[57]
Singer, A., Hindus, D., Stifelman, L., and White, S. 1999. Tangible progress: Less is more in Somewire audio spaces. In Proceedings of Computer-Human Interaction 1999. 104--111.
[58]
Smith, D. 1975. Pygmalion: A creative programming environment. Ph.D. thesis, Stanford University.
[59]
Suzuki, H. and Kato, H. 1993. AlgoBlock: A tangible programming language, a tool for collaborative learning. In Proceedings of the 4th European Logo Conference. 297--303.
[60]
Ten Hagen, P. 1981. Interaction and syntax. Int. J. Man-Mach. Stud. 15.
[61]
Tomoe Soroban Co., Ltd. 2002. Soroban museum: Roman Soroban. http://www.soroban.com/museum/~5s_eng.html {Feb. 2002}.
[62]
Ullmer, B., Ishii, H., and Jacob, R. 2003. Tangible query interfaces: Physically constrained tokens for manipulating database queries. To appear in Proceedings of the International Conference on Computer-Human Interaction 2003.
[63]
Ullmer, B. 2002. Tangible interfaces for manipulating aggregates of digital information. Ph.D. dissertation, MIT Media Laboratory.
[64]
Ullmer, B. and Ishii, H. 2001. Emerging frameworks for tangible user interfaces. In HCI in the New Millennium, John M. Carroll, Ed. 579--601.
[65]
Ullmer, B. and Ishii, H. 1997. The metaDESK: Models and prototypes for tangible user interfaces. In Proceedings of User Interface Software and Technology 1997. 223--232.
[66]
Ullmer, B., Ishii, H., and Glas, D. 1998. mediaBlocks: Physical containers, transports, and controls for online media. In Computer Graphics Proceedings (SIGGRAPH'98). 379--386.
[67]
Underkoffler, J., Ullmer, B., and Ishii, H. 1999. Emancipated pixels: Real-world graphics in the luminous room. In Computer Graphics Proceedings (SIGGRAPH'99). 385--392.
[68]
Want, R. and Russell, D. 2000. Ubiquitous electronic tagging. IEEE Distrib. Syst. Online 1, 2 (Sept.).
[69]
Yarin, P. and Ishii, H. 1999. TouchCounters: Designing interactive electronic labels for physical containers. In Proceedings of Computer-Human Interaction 1999. 362--368.
[70]
Zhang, J. 1997. The nature of external representations in problem solving. Cogn. Sci. 21, 2, 179--217.
[71]
Zhang, J. and Norman, D. 1994. Representations in distributed cognitive tasks. Cogn. Sci. 18, 87--122.



Published In

ACM Transactions on Computer-Human Interaction  Volume 12, Issue 1
March 2005
146 pages
ISSN:1073-0516
EISSN:1557-7325
DOI:10.1145/1057237

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 March 2005
Published in TOCHI Volume 12, Issue 1


Author Tags

  1. Tangible interfaces
  2. token+constraint interfaces

Qualifiers

  • Article


