A survey of glove-based input

David J. Sturman and David Zeltzer, IEEE Computer Graphics and Applications.
Clumsy intermediary devices constrain our interaction with computers and their applications. Glove-based input devices let us apply our manual dexterity to the task. We provide a basis for understanding the field by describing key hand-tracking technologies and applications using glove-based input. The bulk of development in glove-based input has taken place very recently, and not all of it is easily accessible in the literature. We present a cross-section of the field to date. Hand-tracking… 
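As an illustration of the kind of glove-based input the survey covers, here is a minimal sketch of mapping raw flex-sensor readings to joint angles; the sensor name, ADC values, and 90-degree full-bend assumption are invented for illustration, not taken from the survey.

```python
# Sketch: map raw flex-sensor ADC readings to joint angles via a per-sensor
# linear calibration. Sensor name, ADC values, and the 90-degree full-bend
# range are illustrative assumptions.

def make_calibration(raw_flat, raw_bent, angle_bent=90.0):
    """Return a function mapping a raw reading to a joint angle, assuming
    the sensor responds linearly between the flat and fully bent poses."""
    span = raw_bent - raw_flat

    def to_angle(raw):
        return (raw - raw_flat) / span * angle_bent

    return to_angle

index_mcp = make_calibration(raw_flat=210, raw_bent=760)  # hypothetical joint
print(round(index_mcp(485), 1))  # halfway between flat and bent -> 45.0
```

Real glove drivers typically capture the flat and bent readings per wearer in a short calibration pose sequence, since sensor response varies between hands.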

The Image-Based Data Glove
The design of an image-based data glove (IBDG) prototype suitable for finger-sensitive applications, such as virtual-object manipulation and interaction, is described.
A Conceptual Image-Based Data Glove for Computer-Human Interaction
The design of an image-based data glove (IBDG) prototype suitable for finger-sensitive applications, such as virtual-object manipulation and interaction, is described.
Tracking hands in action for gesture-based computer input
New methods for markerless tracking of the full articulated motion of hands and using tracked motion for gesture-based computer input are introduced and it is shown that even limiting hand tracking to only fingertips can enable new input methods for small form factor devices such as smartphones.
Development of a wearable input device based on human hand-motions recognition
A new wearable input device, a keyglove, that recognizes human hand motions is proposed; it resolves the disadvantages of conventional input devices and can be adapted to the mobile computing environment.
Real-Time and Embedded Detection of Hand Gestures with an IMU-Based Glove
A data glove prototype is presented, comprising a glove-embedded gesture classifier that uses data from Inertial Measurement Units (IMUs) in the fingertips and allows fluent use of the gestures via Bluetooth-connected systems.
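A toy sketch of static gesture classification from per-finger IMU angles, in the spirit of (but not taken from) the embedded classifier above; the gesture templates and angles are invented.

```python
import math

# Sketch: classify static hand gestures from per-fingertip pitch angles
# (degrees, thumb through pinky) using nearest-centroid matching. The
# templates are invented placeholders; the paper's classifier may differ.

TEMPLATES = {
    "fist":  [80, 85, 85, 85, 80],  # all fingers curled
    "open":  [5, 0, 0, 0, 5],       # all fingers extended
    "point": [70, 5, 85, 85, 80],   # index extended
}

def classify(pitches):
    """Return the template whose angles are closest in Euclidean distance."""
    return min(TEMPLATES, key=lambda t: math.dist(TEMPLATES[t], pitches))

print(classify([75, 8, 80, 88, 78]))  # -> "point"
```

An embedded version would run the same comparison over quantized fixed-point angles to fit the glove's microcontroller.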
Ultrasonic glove input device for distance-based interactions
This paper presents distance-based interactions for wearable augmented reality systems enabled by an Ultrasonic Glove input device. The ultrasonic glove contains a tilt sensor and a pair of…
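The distance measurement behind an ultrasonic sensor pair can be sketched as a time-of-flight conversion; the speed of sound and the echo time below are illustrative values, not figures from the paper.

```python
# Sketch: convert an ultrasonic echo's round-trip time to a distance.
# Speed of sound and the 2 ms echo time are illustrative assumptions.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_to_distance(round_trip_s):
    """Distance to the target: sound travels out and back, so halve the path."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(round(echo_to_distance(0.002), 3))  # 2 ms round trip -> 0.343 m
```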
Hand Gesture Modeling, Analysis, and Synthesis
A review of the most recent work related to hand gesture modeling, analysis, and synthesis is presented, and four major classes of hand gesture interface techniques are described: glove-based techniques, vision-based techniques, techniques that use drawing gestures, and other gesture analysis techniques.
Hand Talk-Implementation of a Gesture Recognizing Glove
The motivation for Hand Talk is to compare hand configurations with sign language charts and generate artificial speech which articulates the gestured words.
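Comparing a measured hand configuration against a sign chart, as Hand Talk does, can be sketched as nearest-pattern matching; the chart entries below are simplified placeholders, not an accurate sign-language encoding.

```python
# Sketch: match a measured finger-flexion pattern (1 = curled, 0 = extended,
# thumb through pinky) against a chart of sign configurations. The chart is
# a simplified placeholder, not a real sign-language chart.

CHART = {
    (1, 1, 1, 1, 1): "A",
    (1, 0, 0, 0, 0): "B",
    (0, 1, 1, 1, 1): "G",
}

def match_sign(flexed):
    """Return the letter whose pattern differs in the fewest fingers."""
    best = min(CHART, key=lambda p: sum(a != b for a, b in zip(p, flexed)))
    return CHART[best]

print(match_sign((1, 1, 0, 1, 1)))  # one finger off the "A" pattern -> A
```

The recognized letters would then be buffered into words and handed to a speech synthesizer, per the system's goal of articulating gestured words.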
Real-time Embedded Recognition of Sign Language Alphabet Fingerspelling in an IMU-Based Glove
A data glove prototype, based on multiple small Inertial Measurement Units (IMUs), with a glove-embedded classifier for French Sign Language (LSF); results show that the system is capable of detecting the LSF alphabet.
Vision-based hand pose estimation: A review


A hand gesture interface device
Applications of the glove and its component technologies include use with a host computer that drives a real-time 3-dimensional model of the hand, allowing the glove wearer to manipulate computer-generated objects as if they were real; interpretation of finger-spelling; evaluation of hand impairment; and an interface to a visual programming language.
A synthetic visual environment with hand gesturing and voice input
This paper describes a practical synthetic visual environment for use in CAD and teleoperation that uses a standard display, computes smooth-shaded images on an AT&T Pixel Machine, and tracks the hand, bringing the synthetic world into the same space as the hand.
Interacting with paper on the DigitalDesk
The DigitalDesk is built around an ordinary physical desk and can be used as such, but it has extra capabilities, including a video camera mounted above the desk that can detect where the user is pointing, and it can read documents that are placed on the desk.
Charade: remote control of objects using free-hand gestures
An application that uses hand gesture input to control a computer while giving a presentation and an interaction model, a notation for gestures, and a set of guidelines to design gestural command sets are presented.
Specifying gestures by example
GRANDMA, a toolkit for rapidly adding gestures to direct manipulation interfaces, and the trainable single-stroke gesture recognizer used by GRANDMA are described.
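A toy single-stroke recognizer in the spirit of trainable, example-based recognizers like GRANDMA's; the stroke features and training examples below are invented for illustration.

```python
import math

# Sketch: extract simple features from a single stroke and classify by
# nearest labeled example. Features and examples are illustrative; real
# trainable recognizers use richer feature sets.

def features(points):
    """Path length and net direction (radians) of a stroke of (x, y) points."""
    length = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    return (length, math.atan2(dy, dx))

EXAMPLES = {
    "right-flick": features([(0, 0), (50, 0), (100, 0)]),
    "up-flick":    features([(0, 0), (0, -50), (0, -100)]),
}

def recognize(points):
    f = features(points)
    return min(EXAMPLES, key=lambda g: math.dist(EXAMPLES[g], f))

print(recognize([(0, 0), (60, 2), (95, 1)]))  # -> "right-flick"
```

Training here is just recording one example per gesture; adding more examples per class and averaging their features makes the matcher more robust.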
“Put-that-there”: Voice and gesture at the graphics interface
  • R. Bolt
  • SIGGRAPH '80, 1980
The work described herein involves the user commanding simple shapes about a large-screen graphics display surface, and because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression.
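The pronoun resolution Bolt describes, binding "that" and "there" to concurrent pointing events, can be sketched as follows; the object and location names are placeholders.

```python
# Sketch: resolve the pronouns in a "Put that there" command by pairing each
# spoken "that"/"there" with the pointing event that accompanied it. Object
# and location names are invented placeholders.

def resolve(command, pointing_events):
    """Replace each 'that'/'there' with the item pointed at when it was spoken."""
    events = iter(pointing_events)
    return " ".join(next(events) if w in ("that", "there") else w
                    for w in command.split())

print(resolve("put that there", ["blue circle", "upper left"]))
# -> "put blue circle upper left"
```

The gain in naturalness comes from this pairing: the speaker never has to name the shape or the coordinates explicitly.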
An evaluation of 3-D object pointing using a field sequential stereoscopic display
These experiments show that it is possible to point at a 3-D object on a field sequential stereoscopic display with relatively high accuracy; however, pointing at objects in depth takes much more time than pointing at objects in a plane.
Calibrating a VPL DataGlove for teleoperating the Utah/MIT hand
  • Jia-Xing Hong, X. Tan
  • Proceedings, 1989 International Conference on Robotics and Automation, 1989
A system able to control the Utah/MIT hand with the VPL DataGlove has been developed that can successfully implement various high-level tasks under the DataGlove wearer's control.
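Calibrating a glove joint to a robot-hand joint is commonly done with a linear fit; here is a least-squares sketch under that assumption, with readings and reference angles made up for illustration.

```python
# Sketch: fit a per-joint linear map from raw glove readings to robot-hand
# joint angles using least squares over a few calibration poses. Readings
# and reference angles are illustrative, not from the paper.

def fit_linear(raw, angle):
    """Least-squares gain and offset so that angle ~= gain * raw + offset."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_a = sum(angle) / n
    gain = (sum((r - mean_r) * (a - mean_a) for r, a in zip(raw, angle))
            / sum((r - mean_r) ** 2 for r in raw))
    offset = mean_a - gain * mean_r
    return gain, offset

gain, offset = fit_linear([100, 300, 500], [0.0, 45.0, 90.0])
print(round(gain * 400 + offset, 1))  # unseen reading 400 -> 67.5 degrees
```

One such fit per joint, captured while the wearer matches a few reference poses, is the simplest form of glove-to-hand calibration.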
A computer music system that follows a human conductor
An electronic orchestra with a complex performance database and MIDI (Musical Instrument Digital Interface) controllers is presented. This system responds to the gestures of a conductor through a CCD…
Hand gesture coding based on experiments using a hand gesture interface device
It would be ideal for computer-human interaction if a computer could understand human gestures; a hand gesture interface device, the VPL DataGlove™, provides real-time information on a user's hand movement.