ABSTRACT
Over the past few years, multi-touch user interfaces have evolved from research prototypes into mass-market products, driven mainly by innovative devices such as Apple's iPhone and Microsoft's Surface tabletop computer. Unfortunately, existing multi-touch development frameworks lack software engineering abstractions: many multi-touch applications are built on hard-coded, procedural low-level event processing, which leads to proprietary solutions with poor gesture extensibility and little cross-application reusability. We present Midas, a declarative model for the definition and detection of multi-touch gestures in which gestures are expressed as logical rules over a set of input facts. We highlight how our rule-based language approach improves gesture extensibility and reusability. Finally, we introduce JMidas, an instantiation of Midas for the Java programming language, and describe how JMidas has been applied to implement a number of innovative multi-touch gestures.
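To give a flavour of the rule-based approach the abstract describes, the following is a minimal, hypothetical Java sketch: touch events are asserted as facts, and a gesture is a declarative rule (a predicate over the fact set) rather than hard-coded event handling. All names here (`TouchFact`, `GestureRule`, `distanceAt`) are illustrative assumptions, not the actual JMidas API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class RuleSketch {
    // A fact describing one finger's position at one point in time.
    record TouchFact(int fingerId, double x, double y, long time) {}

    // A gesture is declared as a named condition over the fact base,
    // not as imperative per-event callback code.
    record GestureRule(String name, Predicate<List<TouchFact>> matches) {}

    // Distance between finger 1 and finger 2 at a given timestamp.
    static double distanceAt(List<TouchFact> fs, long t) {
        TouchFact a = null, b = null;
        for (TouchFact f : fs) {
            if (f.time() == t) {
                if (f.fingerId() == 1) a = f; else b = f;
            }
        }
        return Math.hypot(a.x() - b.x(), a.y() - b.y());
    }

    public static void main(String[] args) {
        List<TouchFact> facts = new ArrayList<>();
        // Two fingers moving apart over time: a simple "spread" gesture.
        facts.add(new TouchFact(1, 100, 100, 0));
        facts.add(new TouchFact(2, 110, 100, 0));
        facts.add(new TouchFact(1, 80, 100, 50));
        facts.add(new TouchFact(2, 130, 100, 50));

        // Declarative condition: the inter-finger distance increases.
        GestureRule spread = new GestureRule("spread",
                fs -> distanceAt(fs, 50) > distanceAt(fs, 0));

        if (spread.matches().test(facts)) {
            System.out.println("detected: " + spread.name());  // prints "detected: spread"
        }
    }
}
```

In the actual Midas model the matching is performed by a rule engine over the fact base, so new gestures can be added as new rules without touching existing event-handling code; this sketch only imitates that separation of concerns in plain Java.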
Midas: A Declarative Multi-Touch Interaction Framework