We propose a method for computer-based speed writing, SHARK (shorthand aided rapid keyboarding), which augments stylus keyboarding with shorthand gesturing. SHARK defines a shorthand symbol for each word according to its movement pattern on an optimized stylus keyboard. The key principles for the SHARK design include high efficiency stemming from layout …
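As a rough illustration of that idea, the sketch below builds a word's shorthand symbol as the polyline through the centres of its letter keys. The key coordinates and the `word_template` helper are hypothetical placeholders, not the optimized SHARK layout from the paper.

```python
# Minimal sketch, assuming a word-gesture keyboard where each word's
# shorthand symbol is the polyline through its letters' key centres.
# The (x, y) key centres below are hypothetical, not the SHARK layout.

KEY_CENTRES = {
    'c': (0.0, 0.0),
    'a': (1.0, 0.5),
    'n': (2.0, 0.0),
}

def word_template(word, key_centres=KEY_CENTRES):
    """Ideal gesture path for a word: the sequence of its letters' key centres."""
    return [key_centres[ch] for ch in word]

print(word_template("can"))  # [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0)]
```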
1. INTRODUCTION Throughout human civilization, text has been an indispensable channel of communication. Modern computers equipped with desktop keyboards have dramatically increased the ease and volume of text-based communication in the form of email, text chat, and Web posting. As computing technologies expanded beyond the confines of the desktop, the need …
Zhai and Kristensson (2003) presented a method of speed-writing for pen-based computing which utilizes gesturing on a stylus keyboard for familiar words and tapping for others. In SHARK², we eliminated the necessity to alternate between the two modes of writing, allowing any word in a large vocabulary (e.g. 10,000-20,000 words) to be …
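A hedged sketch of what vocabulary-wide matching could look like: both the user's trace and each word's key-centre template are resampled to a fixed number of points, and words are ranked by mean point-to-point distance. The `resample` and `best_word` helpers, the toy templates, and the scoring rule are illustrative assumptions, not the published SHARK² recognition channels.

```python
import numpy as np

def resample(points, n=32):
    """Resample a 2-D polyline to n points evenly spaced along its arc length."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    total = cum[-1] if cum[-1] > 0 else 1e-9
    targets = np.linspace(0.0, total, n)
    return np.column_stack([np.interp(targets, cum, pts[:, 0]),
                            np.interp(targets, cum, pts[:, 1])])

def best_word(trace, templates, n=32):
    """Return the word whose key-centre template lies closest to the trace."""
    t = resample(trace, n)
    def distance(path):
        return float(np.mean(np.linalg.norm(t - resample(path, n), axis=1)))
    return min(templates, key=lambda word: distance(templates[word]))

# Hypothetical templates (polylines through key centres) for a tiny vocabulary.
templates = {"can": [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0)],
             "an":  [(1.0, 0.5), (2.0, 0.0)]}
print(best_word([(0.05, 0.0), (1.0, 0.45), (2.0, 0.05)], templates))  # "can"
```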
In this paper we present a new bimanual markerless gesture interface for 3D full-body motion tracking sensors, such as the Kinect. Our interface uses a probabilistic algorithm to incrementally predict users' intended one-handed and two-handed gestures while they are still being articulated. It supports scale and translation invariant recognition of …
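To make the scale and translation invariance concrete, here is a minimal sketch (not the paper's algorithm) that normalizes a hand trajectory by shifting it to its centroid and scaling its largest extent to one, so gestures performed anywhere in the sensor's view, at any size, become directly comparable. The `normalize` helper and the sample 3D points are assumptions for illustration.

```python
import numpy as np

def normalize(points):
    """Make a trajectory translation- and scale-invariant:
    shift it to its centroid and scale its largest extent to 1."""
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)                   # translation invariance
    extent = np.abs(pts).max()
    return pts / (extent if extent > 0 else 1.0)   # scale invariance

# Two versions of the "same" hand path, performed at different positions and
# sizes, collapse to (nearly) identical normalized forms.
small = [(0.0, 0.0, 1.0), (0.1, 0.1, 1.0), (0.2, 0.0, 1.0)]
large = [(2.0, 1.0, 1.5), (3.0, 2.0, 1.5), (4.0, 1.0, 1.5)]
print(np.allclose(normalize(small), normalize(large)))  # True
```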
Interface designers normally strive for a design that minimises the user's effort. However, when the design's objective is to train users to interact with interfaces that are highly dependent on spatial properties (e.g. keypad layout or gesture shapes) we contend that designers should consider explicitly increasing the mental effort of interaction. To test …
It is generally recognized that today's frontier of HCI research lies beyond the traditional desktop computers whose GUI interfaces were built on the foundation of the display, pointing device, and full keyboard. Many interface challenges …
The potential for using stroke gestures to enter, retrieve and select commands and text has been recently unleashed by the popularity of touchscreen devices. This monograph provides a state-of-the-art integrative review of a body of human–computer interaction research on stroke gestures. It begins with an analysis of the design dimensions of stroke …
We present a technique that enables continuous recognition and visualization of pen strokes and touch-screen gestures. We describe an incremental recognition algorithm that provides probability distributions over template classes as a function of users' partial or complete stroke articulations. We show that this algorithm can predict users' intended …
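One way to picture such a distribution, offered only as an illustrative sketch and not the algorithm from the paper: compare the partial stroke against equal-length prefixes of each template and pass the negative distances through a softmax, recomputing the result as each new point arrives. The `partial_distribution` helper, the template names, and the point values are assumptions.

```python
import numpy as np

def partial_distribution(partial, templates):
    """Probability over template classes given a partial stroke.
    Assumes all templates are pre-resampled to a common length that is
    at least as long as the partial stroke."""
    p = np.asarray(partial, dtype=float)
    names = list(templates)
    dists = np.array([
        np.linalg.norm(p - np.asarray(templates[n], dtype=float)[: len(p)],
                       axis=1).mean()
        for n in names
    ])
    scores = -dists                           # smaller distance -> higher score
    probs = np.exp(scores - scores.max())     # numerically stable softmax
    probs /= probs.sum()
    return dict(zip(names, probs))

# Hypothetical unit-square templates; the distribution sharpens as the stroke grows.
templates = {
    "right": [(x / 7, 0.0) for x in range(8)],
    "diag":  [(x / 7, x / 7) for x in range(8)],
}
stroke = [(x / 7, 0.02 * x) for x in range(8)]   # close to "right"
for k in (2, 4, 8):                              # feed growing prefixes
    print(k, partial_distribution(stroke[:k], templates))
```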
The high incidence of literacy deficits among people with severe speech impairments (SSI) has been well documented. Without literacy skills, people with SSI are unable to effectively use orthographic-based communication systems to generate novel linguistic items in spontaneous conversation. To address this problem, phoneme-based communication systems have …
Cognitive neuroscience defines the sense of agency as the experience of controlling one's own actions and, through this control, affecting the external world. We believe that the sense of personal agency is a key factor in how people experience interactions with technology. This paper draws on theoretical perspectives in cognitive neuroscience and describes …