Sensing techniques for tablet+stylus interaction

@inproceedings{Hinckley2014SensingTF,
  title={Sensing techniques for tablet+stylus interaction},
  author={Ken Hinckley and Michel Pahud and Hrvoje Benko and Pourang Irani and François Guimbreti{\`e}re and Marcel Gavriliu and Xiang 'Anthony' Chen and Fabrice Matulic and William Buxton and Andrew D. Wilson},
  booktitle={Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology},
  year={2014}
}
  • Published 5 October 2014
We explore grip and motion sensing to afford new techniques that leverage how users naturally manipulate tablet and stylus devices during pen + touch interaction. We can detect whether the user holds the pen in a writing grip or tucked between his fingers. We can distinguish bare-handed inputs, such as drag and pinch gestures produced by the nonpreferred hand, from touch gestures produced by the hand holding the pen, which necessarily impart a detectable motion signal to the stylus. We can… 
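
One mechanism in the abstract lends itself to a short illustration: a touch made by the hand that holds the pen necessarily shakes the stylus, so correlating touch onset with stylus motion energy can separate the two hands. The sketch below is a minimal illustration of that idea, not the paper's implementation; the window length, threshold, and function names are all illustrative assumptions.

import numpy as np

WINDOW_S = 0.15            # assumed window around touch onset (seconds)
MOTION_THRESHOLD = 0.5     # assumed motion-energy threshold, tuned per device (m/s^2)

def motion_energy(accel: np.ndarray) -> float:
    """RMS deviation of stylus accelerometer magnitude from its mean."""
    mag = np.linalg.norm(accel, axis=1)
    return float(np.sqrt(np.mean((mag - mag.mean()) ** 2)))

def classify_touch(touch_t: float, imu_t: np.ndarray, imu_accel: np.ndarray) -> str:
    """Label a touch as pen-holding hand vs. bare hand from stylus motion.

    imu_t     : (N,) timestamps of stylus accelerometer samples
    imu_accel : (N, 3) stylus accelerometer readings
    """
    mask = (imu_t >= touch_t - WINDOW_S) & (imu_t <= touch_t + WINDOW_S)
    if not mask.any():
        return "unknown"
    if motion_energy(imu_accel[mask]) > MOTION_THRESHOLD:
        return "pen-holding hand"      # touch imparted motion to the stylus
    return "bare (nonpreferred) hand"  # stylus stayed still during the touch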

Citations

Sensing Posture-Aware Pen+Touch Interaction on Tablets
TLDR
This work proposes sensing techniques that transition between various nuances of mobile and stationary use via postural awareness, and shows how these sensors enable posture-aware pen+touch techniques that adapt interaction and morph user interface elements to suit fine-grained contexts of body-, arm-, hand-, and grip-centric frames of reference.
Pre-Touch Sensing for Mobile Interaction
TLDR
A self-capacitance touchscreen that can sense multiple fingers above a mobile device, as well as grip around the screen's edges, is explored; it enables hybrid touch + hover gestures and illustrates how pre-touch sensing offers an intriguing new back-channel for mobile interaction.
Sensing Tablet Grasp + Micro-mobility for Active Reading
TLDR
Capacitive grip sensing and inertial motion sensing are employed to explore the design space of combined grasp + micro-mobility, considering three classes of technique in the context of active reading; this opens up new possibilities for both individual and collaborative interaction with electronic documents.
Gaze and Touch Interaction on Tablets
TLDR
This work proposes gaze and touch input, where touches redirect to the gaze target, and presents a user study comparing this technique to direct touch, showing that users are slightly slower but can interact one-handed with less physical effort.
Get a Grip: Evaluating Grip Gestures for VR Input using a Lightweight Pen
TLDR
The characteristics and potential of the pen as a VR input device are investigated, along with applications enabled by VR pen input and grip postures.
FlexCase: Enhancing Mobile Interaction with a Flexible Sensing and Display Cover
TLDR
The rich design space of FlexCase is explored, showing how touch and flex sensing can be combined to support a novel class of gestures, called Grip & Bend gestures; the underlying technology and gesture-sensing algorithms are described.
PenShaft: Enabling Pen Shaft Detection and Interaction for Touchscreens
TLDR
PenShaft is a battery-free, easy-to-implement solution for augmenting the shaft of a capacitive pen with interactive capabilities: conductive material applied in a specific pattern on the pen’s shaft enables on-pen interactions and whole-shaft interactions, such as rotating and dragging the stylus.
RealPen: Providing Realism in Handwriting Tasks on Touch Surfaces using Auditory-Tactile Feedback
TLDR
RealPen is presented, an augmented stylus for capacitive tablet screens that recreates the physical sensation of writing on paper with a pencil, ball-point pen or marker pen by regenerating the friction-induced oscillation and sound of a real writing tool in contact with paper.
WatchPen: Using Cross-Device Interaction Concepts to Augment Pen-Based Interaction
TLDR
WatchPen is a smartwatch mounted on a passive capacitive stylus; it senses the usage context and leverages it for expression, holds tools and parameters within its display, and acts as an on-demand output, giving users a dynamic relationship between inputs and outputs.
Contact-sensing Input Device Manipulation and Recall
TLDR
A cuboid, tangible, pen-like input device similar to Vogel and Casiez’s Conté is studied; learning and recall of commands located on physical landmarks on the exterior of the 3D tangible device are evaluated in comparison with a 2D spatial interface.

References

Showing 1-10 of 66 references
Grips and gestures on a multi-touch pen
TLDR
This study shows that both swipe and double-tap gestures are comparable in performance to commonly employed barrel buttons, without their disadvantages, and can be successfully and comfortably performed regardless of the rotation of the pen or how the user grips it.
Pen + touch = new tools
TLDR
A division of labor between pen and touch is advocated: the pen writes, touch manipulates, and the combination of pen + touch yields new tools, helping the UI designer avoid encumbrances such as physical buttons, persistent modes, or widgets that detract from the user's focus on the workspace.
Sensor synaesthesia: touch in motion, and motion in touch
TLDR
This work explores techniques for hand-held devices that leverage the multimodal combination of touch and motion, and considers the reverse perspective, that of motion-enhanced touch, which uses motion sensors to probe what happens underneath the surface of touch.
Motion and context sensing techniques for pen computing
TLDR
The initial results suggest that sensor-enhanced stylus input offers a potentially rich modality to augment interaction with slate computers.
HandSense: discriminating different ways of grasping and holding a tangible user interface
TLDR
The HandSense prototype employs capacitive sensors for detecting when it is touched or held against a body part, and is able to correctly classify over 80 percent of all touches, discriminating six different ways of touching the device.
BiTouch and BiPad: designing bimanual interaction for hand-held tablets
TLDR
The BiTouch design space is presented, introducing a support function in the kinematic chain model for interacting with hand-held tablets, and BiPad, a toolkit for creating bimanual tablet interaction with the thumb or the fingers of the supporting hand, is developed.
GripSense: using built-in sensors to detect hand posture and pressure on commodity mobile phones
We introduce GripSense, a system that leverages mobile device touchscreens and their built-in inertial sensors and vibration motor to infer hand postures, including one- or two-handed interaction, use of thumb or index finger, or use on a table. (A toy sketch of the vibration-damping idea appears after this reference list.)
iGrasp: grasp-based adaptive keyboard for mobile devices
TLDR
iGrasp is presented, which automatically adapts the layout and position of virtual keyboards based on how and where users are grasping the device, without requiring explicit user input.
FlexAura: a flexible near-surface range sensor
TLDR
A new optical range sensor design based on high-power infrared LEDs and photo-transistors, which can be fabricated on a flexible PCB and wrapped around a wide variety of graspable objects, including pens, mice, smartphones, and slates, is presented.
Hand Grip Pattern Recognition for Mobile User Interfaces
TLDR
This paper presents a novel user interface for handheld mobile devices that recognizes hand grip patterns, using pattern-recognition techniques to identify users' hand grips from the touch sensors.
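
The GripSense reference above mentions an unusual sensing channel: pulsing the phone's vibration motor and watching how firmly the hand damps the resulting oscillation, as seen by the gyroscope. The sketch below is a toy illustration of that damping idea, not GripSense's published algorithm; the baseline energy, ratio thresholds, and pressure levels are invented for illustration.

import numpy as np

BASELINE_GYRO_ENERGY = 1.0  # assumed energy with a light, unpressed grip ((rad/s)^2)

def gyro_energy(gyro: np.ndarray) -> float:
    """Mean squared angular velocity over one vibration pulse; gyro shape (N, 3)."""
    return float(np.mean(np.sum(gyro ** 2, axis=1)))

def infer_pressure(gyro_during_pulse: np.ndarray) -> str:
    """Map the damping ratio (observed / baseline energy) to a coarse pressure level."""
    ratio = gyro_energy(gyro_during_pulse) / BASELINE_GYRO_ENERGY
    if ratio > 0.8:
        return "light"   # little damping: the hand barely loads the device
    if ratio > 0.5:
        return "medium"
    return "firm"        # heavy damping: a hard press absorbs the vibration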