Type-hover-swipe in 96 bytes: a motion sensing mechanical keyboard

@inproceedings{Taylor2014TypehoverswipeI9,
  title={Type-hover-swipe in 96 bytes: a motion sensing mechanical keyboard},
  author={Stuart Taylor and Cem Keskin and Otmar Hilliges and Shahram Izadi and John Helmes},
  booktitle={Proceedings of the SIGCHI Conference on Human Factors in Computing Systems},
  year={2014}
}
  • Stuart Taylor, Cem Keskin, Otmar Hilliges, Shahram Izadi, John Helmes
  • Published 26 April 2014
  • Computer Science
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
We present a new type of augmented mechanical keyboard, capable of sensing rich and expressive motion gestures performed both on and directly above the device. Our hardware comprises a low-resolution matrix of infrared (IR) proximity sensors interspersed between the keys of a regular mechanical keyboard. This results in coarse but high frame-rate motion data. We extend a machine learning algorithm, traditionally used for static classification only, to robustly support dynamic, temporal…
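To make the sensing pipeline above concrete, the sketch below shows one way a coarse IR proximity frame stream could be reduced to a decaying motion-history image and passed to a per-frame classifier. It is a minimal illustration only: the 4x24 grid, decay rate, threshold, and nearest-template matcher are assumptions for this sketch, not details from the paper, which uses a learned classifier.

```python
# Minimal sketch (not the authors' implementation): a hypothetical low-resolution
# IR proximity frame stream is reduced to a motion-history feature per frame.
import numpy as np

ROWS, COLS = 4, 24          # hypothetical sensor grid interspersed between the keys
DECAY = 0.85                # how quickly old motion fades out of the history image

def update_motion_history(history, prev, curr, thresh=0.1):
    """Decay the history image, then stamp in pixels whose proximity changed."""
    moved = np.abs(curr - prev) > thresh
    history = history * DECAY
    history[moved] = 1.0
    return history

def classify(history, templates):
    """Toy nearest-template matcher standing in for the paper's learned classifier."""
    flat = history.ravel()
    return min(templates, key=lambda name: np.linalg.norm(flat - templates[name].ravel()))

# Simulated frame stream standing in for real sensor readings.
rng = np.random.default_rng(0)
templates = {"hover": rng.random((ROWS, COLS)), "swipe": rng.random((ROWS, COLS))}
history = np.zeros((ROWS, COLS))
prev = rng.random((ROWS, COLS))
for _ in range(30):
    curr = rng.random((ROWS, COLS))      # one low-resolution proximity frame
    history = update_motion_history(history, prev, curr)
    prev = curr
print(classify(history, templates))      # prints the best-matching gesture label
```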
Keyboard Surface Interaction: Making the keyboard into a pointing device
TLDR
Keyboard Surface Interaction (KSI), an interaction approach that turns the surface of a keyboard into an interaction surface and allows users to rest their hands on the keyboard at all times to minimize fatigue, is introduced.
GestAKey: Get More Done with Just-a-Key on a Keyboard
TLDR
GestAKey, a technique that enables multifunctional keystrokes on a single key and provides new interaction possibilities on familiar keyboards, is presented, along with several GestAKey-enabled proof-of-concept applications.
FlickBoard: Enabling Trackpad Interaction with Automatic Mode Switching on a Capacitive-sensing Keyboard
TLDR
FlickBoard is presented, which combines a touchpad and a keyboard into the same interaction area to reduce hand movement between a separate keyboard and touchpad; it is the first system capable of combining a trackpad and a keyboard into a single interaction area without the need for external switches.
Extending Keyboard Shortcuts with Arm and Wrist Rotation Gestures
TLDR
Results show that arm and wrist rotations during keystrokes can be used for interaction, yet challenges remain for integration into practical applications.
A method for using one finger to press three separate keys on a three-dimensional keyboard designed to be mounted on a mouse
TLDR
It is concluded that the proposed 3-D keyboard can be used accurately with one finger while reducing the distance fingers must move.
Interacting with Soli: Exploring Fine-Grained Dynamic Gesture Recognition in the Radio-Frequency Spectrum
TLDR
A novel machine learning architecture for radio-frequency based gesture recognition with Google's Soli sensor is presented, based on an end-to-end trained combination of deep convolutional and recurrent neural networks.
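As a rough illustration of that style of model (not the Soli authors' actual network; the layer sizes, input shape, and choice of PyTorch here are assumptions), a per-frame convolutional encoder can feed a recurrent layer that accumulates evidence over time and emits a gesture label:

```python
# Hedged sketch: a generic conv + GRU sequence classifier of the kind described,
# with made-up dimensions; it is not the architecture from the Soli paper.
import torch
import torch.nn as nn

class ConvRecurrentGestureNet(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        # Per-frame encoder over a small single-channel sensor image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # -> (batch*time, 32, 1, 1)
        )
        self.rnn = nn.GRU(input_size=32, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 1, H, W) -> per-frame features -> GRU -> logits
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).flatten(1)   # (b*t, 32)
        seq_out, _ = self.rnn(feats.view(b, t, -1))              # (b, t, 64)
        return self.head(seq_out[:, -1])                          # one label per sequence

logits = ConvRecurrentGestureNet()(torch.randn(2, 20, 1, 32, 32))
print(logits.shape)  # torch.Size([2, 4])
```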
In-air gestures around unmodified mobile devices
TLDR
A novel machine learning based algorithm extends the interaction space around mobile devices to augment and enrich the existing interaction vocabulary using gestures, removing screen real-estate issues on small touchscreens and allowing input to be expanded to the 3D space around the device.
Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices
TLDR
Cyclops turns posture recognition into a highly controllable computer vision problem, and four example applications are presented: an interactive bodily workout, a mobile racing game that involves hands and feet, a full-body virtual reality system, and interaction with a tangible toy.
GestAKey: Touch Interaction on Individual Keycaps
TLDR
GestAKey is proposed, leveraging the location and motion of touches on individual keycaps to augment the functionalities of existing keystrokes with simultaneous gestures, and it is shown that GestAKey has performance comparable to hotkeys.
CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring
TLDR
CyclopsRing is the first ring-wearable device that supports whole-hand and context-aware interactions and a set of interaction techniques including on-finger pinch-and-slide input, in-air pinch-and-motion input, palm-writing input, and their interactions with the environmental contexts.

References

SHOWING 1-10 OF 45 REFERENCES
Multi-Touch: A New Tactile 2-D Gesture Interface for Human-Computer Interaction
TLDR
The present approach interprets asynchronous touches on the surface as conventional single-finger typing, while motions initiated by chords are interpreted as pointing, clicking, gesture commands, or hand resting.
SHARK2: a large vocabulary shorthand writing system for pen-based computers
TLDR
The architecture, algorithms and interfaces of a high-capacity multi-channel pen-gesture recognition system that supports a gradual and seamless transition from visually guided tracing to recall-based gesturing are designed and implemented.
The UnMousePad: an interpolating multi-touch force-sensing input pad
TLDR
The UnMousePad is a flexible and inexpensive multitouch input device based on a newly developed pressure-sensing principle called Interpolating Force Sensitive Resistance (IFSR), a general pressure imaging technology that can be incorporated into shoes, tennis racquets, hospital beds, factory assembly lines and many other applications.
Command strokes with and without preview: using pen gestures on keyboard for command selection
TLDR
A new command selection method that provides an alternative to pull-down menus in pen-based mobile interfaces and allows users to directly select commands from a very large set without the need to traverse menu hierarchies is presented.
Touch&Type: a novel pointing device for notebook computers
TLDR
While the mouse outperformed its two counterparts, Touch&Type™ was found to be superior to the conventional touch pad (after a short learning period) with a confidence level of 73%.
Grips and gestures on a multi-touch pen
TLDR
This study shows that both swipe and double tap gestures are comparable in performance to commonly employed barrel buttons without their disadvantages, and can be successfully and comfortably performed regardless of the rotation of the pen or how the user grips it.
FlexAura: a flexible near-surface range sensor
TLDR
A new optical range sensor design based on high power infrared LEDs and photo-transistors, which can be fabricated on a flexible PCB and wrapped around a wide variety of graspable objects including pens, mice, smartphones, and slates, is presented.
LongPad: a touchpad using the entire area below the keyboard of a laptop computer
TLDR
LongPad, a proximity-sensing optical touchpad as wide as a laptop keyboard, is developed, together with a palm rejection algorithm that utilizes proximity images from LongPad; it was observed that LongPad rejected palm touches almost perfectly while participants repeated typing and pointing tasks.
Touch-display keyboards: transforming keyboards into interactive surfaces
TLDR
The Touch-Display Keyboard is introduced, a novel keyboard that combines the physical-ergonomic qualities of the conventional keyboard with dynamic display and touch-sensing embedded in each key, which effectively transforms the keyboard into an interactive surface that is seamlessly integrated with the interaction space of GUIs.
DGTS: Integrated Typing and Pointing
TLDR
The objective of this work is to replace computer mice and touchpads by integrating capacitive sensing into a layer within the keyboard thereby reducing the space required for pointing devices.