We propose a method for computer-based speed writing, SHARK (shorthand aided rapid keyboarding), which augments stylus keyboarding with shorthand gesturing. SHARK defines a shorthand symbol for each word according to its movement pattern on an optimized stylus keyboard. The key principles of the SHARK design include high efficiency stemming from layout …
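As a rough sketch of the idea (not the authors' implementation), a word's shorthand gesture can be taken as the polyline through the centers of its letters' keys on a given layout; the key coordinates below are hypothetical placeholders rather than an actual optimized keyboard.

```python
# Hypothetical (x, y) key centers for a few letters; a real system would use
# the coordinates of an actual (optimized) keyboard layout.
KEY_CENTERS = {
    "c": (0.0, 0.0), "a": (1.0, 0.0), "t": (2.0, 0.0),
    "h": (0.5, 1.0), "e": (1.5, 1.0),
}

def word_to_gesture(word, key_centers=KEY_CENTERS):
    """Return a word's gesture template as the list of its letters' key centers."""
    return [key_centers[ch] for ch in word.lower()]

print(word_to_gesture("cat"))  # [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
```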
1. INTRODUCTION Throughout human civilization, text has been an indispensable channel of communication. Modern computers equipped with desktop keyboards have dramatically increased the ease and volume of text-based communication in the form of email, text chat, and Web posting. As computing technologies expanded beyond the confines of the desktop, the need …
Zhai and Kristensson (2003) presented a method of speed writing for pen-based computing that utilizes gesturing on a stylus keyboard for familiar words and tapping for others. In SHARK², we eliminated the necessity to alternate between the two modes of writing, allowing any word in a large vocabulary (e.g. 10,000-20,000 words) to be …
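One plausible way to illustrate recognizing a drawn trace against a large vocabulary is nearest-template shape matching: resample the trace and each word's gesture template to the same number of points and compare them point by point. The sketch below is only that illustration, under our own assumptions; it is not the SHARK² recognition algorithm, and all function names are ours.

```python
import math

def resample(points, n=32):
    """Resample a polyline to n points spaced evenly along its arc length."""
    if len(points) < 2:
        return list(points) * n
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(points) - 2 and dists[j + 1] < target:
            j += 1
        seg = (dists[j + 1] - dists[j]) or 1.0
        t = (target - dists[j]) / seg
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def shape_distance(a, b, n=32):
    """Mean point-to-point distance between two polylines after resampling."""
    ra, rb = resample(a, n), resample(b, n)
    return sum(math.hypot(p[0] - q[0], p[1] - q[1]) for p, q in zip(ra, rb)) / n

def recognize(trace, templates):
    """Return the word whose gesture template is closest in shape to the trace."""
    return min(templates, key=lambda w: shape_distance(trace, templates[w]))
```

Templates for the whole vocabulary could be built with a word-to-gesture mapping like the one sketched earlier; a practical recognizer would also need normalization, pruning, and language-model integration, which this sketch omits.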
In this paper we present a new bimanual markerless gesture interface for 3D full-body motion tracking sensors, such as the Kinect. Our interface uses a probabilistic algorithm to incrementally predict users' intended one-handed and two-handed gestures while they are still being articulated. It supports scale and translation invariant recognition of …
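A minimal sketch of scale- and translation-invariant incremental matching, under our own assumptions rather than the paper's algorithm, might normalize the partial trajectory, compare it against each candidate template's prefix, and turn the distances into a probability distribution over candidates:

```python
import math

def normalize(points):
    """Center a trajectory on its centroid and scale it to unit size, so that
    matching ignores where the gesture was drawn and how large it is."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def incremental_posterior(partial, templates):
    """Score the partially articulated gesture (non-empty list of points)
    against each template's prefix and return a probability distribution
    over candidate gestures (softmax of negative distances)."""
    p = normalize(partial)
    scores = {}
    for name, template in templates.items():
        t = normalize(template[:max(2, len(partial))])
        m = min(len(p), len(t))
        d = sum(math.hypot(p[i][0] - t[i][0], p[i][1] - t[i][1]) for i in range(m)) / m
        scores[name] = math.exp(-d)  # closer prefixes get higher scores
    z = sum(scores.values())
    return {name: s / z for name, s in scores.items()}
```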
Interface designers normally strive for a design that minimises the user's effort. However, when the design's objective is to train users to interact with interfaces that are highly dependent on spatial properties (e.g. keypad layout or gesture shapes), we contend that designers should consider explicitly increasing the mental effort of interaction. To test …
It is generally recognized that today's frontier of HCI research lies beyond the traditional desktop computers whose GUI interfaces were built on the foundation of display, pointing device, and full keyboard. Many interface challenges …
The potential for using stroke gestures to enter, retrieve and select commands and text has been recently unleashed by the popularity of touchscreen devices. This monograph provides a state-of-the-art integrative review of a body of human–computer interaction research on stroke gestures. It begins with an analysis of the design dimensions of stroke …
We investigate ways to improve recognition accuracy on spoken corrections. We show that a variety of simple techniques can greatly improve the accuracy on corrections. We further develop a flexible merge model that improves accuracy by combining information from the original recognition and the spoken correction. Our merge model operates on word confusion …
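As a hedged illustration of the general idea of combining two recognition passes (not the paper's merge model, which operates on word confusion networks), the sketch below merges the word-probability distributions from the original recognition and the spoken correction for a single slot by log-linear interpolation:

```python
def merge_hypotheses(original, correction, weight=0.5):
    """Merge two word-probability distributions for the same slot by
    log-linear interpolation; `weight` controls how much the spoken
    correction is trusted relative to the original recognition."""
    eps = 1e-9  # floor for words missing from one of the two hypothesis sets
    words = set(original) | set(correction)
    merged = {
        w: (original.get(w, eps) ** (1.0 - weight)) * (correction.get(w, eps) ** weight)
        for w in words
    }
    z = sum(merged.values())
    return {w: p / z for w, p in merged.items()}

# Toy example: the correction resolves an originally confusable slot.
original = {"wreck": 0.55, "recognize": 0.45}
correction = {"recognize": 0.90, "wreck": 0.10}
merged = merge_hypotheses(original, correction, weight=0.7)
print(max(merged, key=merged.get))  # "recognize"
```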
We study the performance and user experience of two popular mainstream mobile text entry methods: the Smart Touch Keyboard (STK) and the Smart Gesture Keyboard (SGK). Our first study is a lab-based ten-session text entry experiment. In our second study we use a new text entry evaluation methodology based on the experience sampling method (ESM). In the ESM …
The high incidence of literacy deficits among people with severe speech impairments (SSI) has been well documented. Without literacy skills, people with SSI are unable to effectively use orthographic-based communication systems to generate novel linguistic items in spontaneous conversation. To address this problem, phoneme-based communication systems have …