ABSTRACT
T(ether) is a spatially-aware display system for multi-user, collaborative manipulation and animation of virtual 3D objects. The handheld display acts as a window into virtual reality, providing users with a perspective view of 3D data. T(ether) tracks users' heads, hands, fingers, and pinch gestures, in addition to input on a handheld touch screen, to enable rich interaction with the virtual scene. We introduce gestural interaction techniques that exploit proprioception to adapt the UI based on the hand's position above, behind, or on the surface of the display. These spatial interactions use a tangible frame of reference to help users manipulate and animate the model, as well as control environment properties. We report initial user observations from a 3D modeling experiment, which indicate T(ether)'s potential for embodied viewport control and 3D modeling interactions.
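The zone-dependent UI described above hinges on classifying a tracked hand relative to the plane of the handheld display. A minimal sketch of that classification, assuming hand and display poses are already available from the tracking system (the function name, the coordinate convention, and the 2 cm "on surface" threshold are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def classify_hand_zone(hand_pos, display_origin, display_normal, surface_eps=0.02):
    """Classify a tracked hand relative to the display plane.

    Returns "above", "on_surface", or "behind" based on the signed
    distance (metres) from the hand to the plane of the handheld
    display. The 2 cm band for "on_surface" is an assumed tolerance
    for tracking noise, not a value from the paper.
    """
    n = np.asarray(display_normal, dtype=float)
    n = n / np.linalg.norm(n)  # unit normal, pointing toward the user
    offset = np.asarray(hand_pos, dtype=float) - np.asarray(display_origin, dtype=float)
    d = float(np.dot(offset, n))  # signed distance to the display plane
    if abs(d) <= surface_eps:
        return "on_surface"
    return "above" if d > 0 else "behind"
```

With a display at the origin facing +z, a hand 30 cm in front of the screen classifies as "above", a hand 30 cm beyond it as "behind", and a hand within the tolerance band as "on_surface"; the UI would then switch gesture sets accordingly.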