Corpus ID: 6207455

Keyboard Surface Interaction: Making the keyboard into a pointing device

Julian Ramos, Z. Li, Johana Rosas, Nikola Banovic, Jennifer Mankoff, Anind K. Dey
Pointing devices that reside on the keyboard can reduce the overall time needed to perform mixed pointing and typing tasks, since the user's hand does not have to reach for a separate pointing device. However, previous implementations of this kind of device have exhibited higher movement times than the mouse and trackpad due to high error rates, low speed, and low spatial resolution. In this paper we introduce Keyboard Surface Interaction (KSI), an interaction approach that turns the surface of a… 
Re-envisioning the Keyboard as a Spatial User Interface
This work compares a keyboard surface interaction system (Fingers), designed to preserve the visuospatial relationships between web page elements, to a traditional screen reader (VoiceOver), measuring interaction effectiveness in terms of time spent and interactions per task on four simulated shopping tasks.
ReType: Quick Text Editing with Keyboard and Gaze
It is concluded that the gaze-augmented user interface can make common interactions more fluent, especially for professional keyboard users.
Evaluating Text Entry in Virtual Reality using a Touch-sensitive Physical Keyboard
The results indicate that text entry using touch-sensitive physical keyboards can be as efficient as the fingertip visualization, but that results vary between experienced and inexperienced typists.
Nonvisual Interaction Techniques at the Keyboard Surface
This work introduces Spatial Region Interaction Techniques (SPRITEs) for nonvisual access: a novel method for navigating two-dimensional structures using the keyboard surface and shows that three times as many participants were able to complete spatial tasks with SPRITEs than with their preferred current technology.
ReconViguRation: Reconfiguring Physical Keyboards in Virtual Reality
A set of input and output mappings for reconfiguring the virtual presentation of physical keyboards in VR are explored and the resulting design space is probed by specifically designing, implementing and evaluating nine VR-relevant applications.
A Structure Design of Virtual and Real Fusion Intelligent Equipment and Multimodal Navigational Interaction Algorithm
Students can use virtual-real fusion interactions through tactile and auditory channels to independently complete simulations and learning experiments based on experimental navigation; the MNIVRFCL reduces time consumption and improves students' interaction efficiency.
Mixed Reality Interaction Techniques
This chapter gives an overview of interaction techniques for mixed reality including augmented and virtual reality (AR/VR) and the combination of multiple modalities in multisensory and multimodal interaction.
Typealike: Near-Keyboard Hand Postures for Expanded Laptop Interaction
Fig. 1. Typealike postures are formed using the left or right hand, an open or closed hand form, and different wrist orientations, all further distinguished by hand position.


The effect of reducing homing time on the speed of a finger-controlled isometric pointing device
The experiment shows that the mouse is the faster pointing device and that a finger-controlled device complies with Fitts' law; it further shows that efforts to design faster pointing devices should focus on increasing the Fitts' law Index of Performance rather than on reducing the homing time.
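For reference, the Fitts' law Index of Performance invoked in this summary is a standard HCI quantity; as a sketch (Shannon formulation, with symbols not defined in the source itself):

```latex
% Fitts' law (Shannon formulation): movement time MT to acquire a
% target of width W at distance (amplitude) D, with empirically
% fitted constants a and b.
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)
% The logarithmic term is the index of difficulty ID (in bits);
% the Index of Performance (throughput) is the reciprocal slope:
IP = \frac{1}{b} \quad \text{(bits/s)}
```

A device with a higher IP gains more speed as target difficulty grows, which is why the cited experiment argues IP, not homing time, is the lever for faster pointing devices.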
Touch&Type: a novel pointing device for notebook computers
While the mouse outperformed its two counterparts, Touch&Type™ was found to be superior to the conventional touch pad (after a short learning period) with a confidence level of 73%.
Touch-display keyboards: transforming keyboards into interactive surfaces
The Touch-Display Keyboard is introduced, a novel keyboard that combines the physical-ergonomic qualities of the conventional keyboard with dynamic display and touch-sensing embedded in each key, which effectively transforms the keyboard into an interactive surface that is seamlessly integrated with the interaction space of GUIs.
Comparison of Mouse, Touchpad and Multitouch Input Technologies
Twelve subjects used fixed-angle split keyboards with a conventional mouse, an integrated touchpad, or an integrated MultiTouch surface. Three tasks were tested in each condition: data entry, text
Scroll, tilt or move it: using mobile phones to continuously control pointers on large public displays
A comparative evaluation of three different interaction techniques for continuous control of a pointer located on a remote display using a mobile phone revealed that while Move and Tilt can be faster, they also introduce higher error rates for selection tasks.
Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys, for text selection on a CRT
Four devices are evaluated with respect to how rapidly they can be used to select text on a CRT display. The mouse is found to be fastest on all counts and also to have the lowest error
Type-hover-swipe in 96 bytes: a motion sensing mechanical keyboard
A new type of augmented mechanical keyboard, capable of sensing rich and expressive motion gestures performed both on and directly above the device, is presented and a machine learning algorithm is extended to robustly support dynamic, temporal gestures.
Rubbing and tapping for precise and rapid selection on touch-screen displays
Two families of techniques that use zooming to make possible precise interaction on passive touch screens, rubbing and tapping, are introduced and it is shown how the techniques can be used for fluid interaction in an image viewer and in Google Maps.
Carpal tunnel syndrome due to keyboarding and mouse tasks: a review
Abstract So far, many different studies have examined possible implications of typing related posture and activity on carpal tunnel syndrome (CTS) incidence. Although they tend to present the
FlowMouse: A Computer Vision-Based Pointing and Gesture Input Device
FlowMouse is introduced, a computer vision-based pointing device and gesture input system that uses optical flow techniques to model the motion of the hand and a capacitive touch sensor to enable and disable interaction.