ABSTRACT
We propose a multimodal user interface system that uses pen and voice to draw diagrams, especially system configuration figures. We have built a system called TalkingDraw, which supports real-time drawing during conversation without interfering with natural talking.