Corpus ID: 34062092

Formalization and Combination of Touch and Point Interaction

@inproceedings{Kammer2011FormalizationAC,
  title={Formalization and Combination of Touch and Point Interaction},
  author={Dietrich Kammer and Dana Henkens and Jan Wojdziak},
  year={2011}
}
Gestural interfaces have been the subject of research in the HCI community for a long time. In contrast to existing examples of multimodal interfaces, which combine freehand gestures with speech, this paper proposes the combination of both touch and point gestures. Touch requires users to actually initiate contact with the interaction surface, whereas point interaction allows them to use hand movements in the proximity of the user interface. Formalization is deemed necessary to subsequently…
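
To illustrate the kind of combination the abstract describes, the following is a minimal Python sketch of how touch atoms and point atoms might be composed into a single multimodal gesture. All names here (Modality, AtomicGesture, CombinedGesture) and the shape vocabulary are assumptions for illustration only; they do not reflect the formal notation defined in the paper.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Set, Tuple

class Modality(Enum):
    TOUCH = auto()   # contact with the interaction surface
    POINT = auto()   # freehand movement in proximity of the surface

@dataclass(frozen=True)
class AtomicGesture:
    modality: Modality
    shape: str                     # e.g. "HOLD", "TAP", "LINE"
    position: Tuple[float, float]  # normalized surface coordinates

@dataclass(frozen=True)
class CombinedGesture:
    # A multimodal gesture composed of touch and/or point atoms.
    atoms: Tuple[AtomicGesture, ...]

    def modalities(self) -> Set[Modality]:
        return {a.modality for a in self.atoms}

# Example: hold an object via touch while pointing at a target location.
hold = AtomicGesture(Modality.TOUCH, "HOLD", (0.2, 0.5))
target = AtomicGesture(Modality.POINT, "TAP", (0.8, 0.3))
move_to_target = CombinedGesture((hold, target))
assert move_to_target.modalities() == {Modality.TOUCH, Modality.POINT}

A full formalization would additionally constrain timing and spatial relations between the atoms; this sketch only captures which modalities are combined.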

References

Showing 1-10 of 11 references
Towards a formalization of multi-touch gestures
A discussion of strategies towards a formalization of gestural interaction on multi-touch surfaces is presented, together with a test environment, showing the applicability and benefit within multi-touch frameworks.
A study of hand shape use in tabletop gesture interaction
Implications for tabletop gesture interaction design include suggestions for the use of different hand shapes for input, the desirability of combining touch-screen and computer-vision gesture input, and possibilities for flexible two-handed interaction.
Integrating Point and Touch for Interaction with Digital Tabletop Displays
Studies investigating TractorBeam's use for target selection, docking, and puzzle tasks give some positive results, but further tests of its usefulness in collaborative activities are necessary.
Extending touch: towards interaction with large-scale surfaces
The system is based on 3D reconstruction using standard RGB cameras only and allows seamless switching between touch and pointing, even while interacting. Users preferred the system to a touch-only system because they had more freedom during interaction and could solve the presented task significantly faster.
Midas: a declarative multi-touch interaction framework
This work presents Midas, a declarative model for the definition and detection of multi-touch gestures in which gestures are expressed via logical rules over a set of input facts, and highlights how the rule-based language approach improves gesture extensibility and reusability.
“Put-that-there”: Voice and gesture at the graphics interface
  • R. Bolt
  • Computer Science
  • SIGGRAPH '80
  • 1980
The work described herein involves the user commanding simple shapes about a large-screen graphics display surface, and because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression.
Multi-point interactions with immersive omnidirectional visualizations in a dome
The Pinch-the-Sky Dome is an installation in which several users inside the dome can simultaneously interact with omnidirectional data using freehand gestures and infrared laser pointers, providing a highly immersive and interactive experience.
Multimodal References in GEORAL TACTILE
The paper specifically presents how linguistic (oral) and tactile references are dealt with in the GEORAL system, which has already been described in other papers. In this system, users can formulate…
Situated Interaction with Ambient Information: Facilitating Awareness and Communication in Ubiquitous Work Environments
The approach is introduced, along with examples of realizations of situated interaction in the context of future work environments, going beyond not only traditional PC-based workplaces but also electronic meeting rooms and the roomware components previously developed by the authors.
SmartKom mobile: intelligent ubiquitous user interaction
The mobile version of the SmartKom system, which is device-independent and realizes multimodal interaction in cars and on mobile devices such as PDAs, is presented.