Corpus ID: 14869757

Eype - Using Eye-Traces for Eye-Typing

@inproceedings{Hoppe2013EypeUsingEF,
  title={Eype - Using Eye-Traces for Eye-Typing},
  author={Sabrina Hoppe and Florian Daiber},
  booktitle={CHI'13 Workshop on Grand Challenges in Text Entry},
  year={2013}
}
Copyright is held by the author/owner(s). CHI'13 Workshop on Grand Challenges in Text Entry, April 28, 2013, Paris, France.

Abstract
Current eye-typing systems suffer from the dwell timeout required for key selection, which limits the achievable entry rate. In this position paper we discuss how eye-traces on on-screen keyboards could enable almost dwell-timeout-free gaze-based communication, which could significantly increase the entry rate of eye-typing systems.
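The entry-rate limit imposed by the dwell timeout can be made concrete with a back-of-the-envelope bound: each character costs at least one dwell timeout plus one saccade, and entry rate is conventionally measured in five-character words. A minimal sketch (the dwell and saccade durations below are illustrative assumptions, not values from the paper):

```python
def max_entry_rate_wpm(dwell_s: float, saccade_s: float = 0.04,
                       chars_per_word: int = 5) -> float:
    """Upper bound on words per minute for dwell-based eye-typing:
    each character costs one dwell timeout plus one saccade."""
    return 60.0 / (chars_per_word * (dwell_s + saccade_s))

# Typical dwell timeouts of 400-1000 ms cap the rate well below the
# ~46 wpm reported for dwell-free eye-typing in the references below.
for dwell in (1.0, 0.5, 0.4):
    print(f"dwell {dwell:.1f}s -> at most {max_entry_rate_wpm(dwell):.1f} wpm")
```

Even an aggressive 400 ms dwell caps the rate at roughly 27 wpm before any correction overhead, which is why trace-based, dwell-free input is attractive.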


Filteryedping: A Dwell-Free Eye Typing Technique
TLDR: This work demonstrates an eye typing technique which does not require the user to dwell on the letters that she wants to input, and which automatically filters out unwanted letters from the sequence of letters gazed at while typing a word.
Filteryedping: Design Challenges and User Performance of Dwell-Free Eye Typing
TLDR: A dwell-free eye typing technique that filters out unintentionally selected letters from the sequence of letters looked at by the user, ranks candidate words by their length and frequency of use, and suggests them to the user.
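The filter-and-rank idea described above can be sketched as a subsequence test against a lexicon: a candidate word survives if its letters occur, in order, within the sequence of gazed letters, and survivors are ranked by length and frequency. The toy lexicon and frequency counts below are illustrative assumptions, not Filteryedping's actual data:

```python
def is_subsequence(word: str, gazed: str) -> bool:
    """True if `word` can be formed by deleting letters from `gazed`."""
    it = iter(gazed)
    return all(ch in it for ch in word)

def rank_candidates(gazed: str, lexicon: dict[str, int]) -> list[str]:
    """Keep lexicon words that are subsequences of the gaze sequence and
    rank longer, more frequent words first (simplified, hypothetical scoring)."""
    hits = [w for w in lexicon if is_subsequence(w, gazed)]
    return sorted(hits, key=lambda w: (-len(w), -lexicon[w]))

# Sweeping the gaze from 'i' to 'h' may pass over unintended keys like 'u'.
toy_lexicon = {"hi": 500, "high": 50, "hug": 20, "it": 300}
print(rank_candidates("hiugh", toy_lexicon))
```

The subsequence test alone over-generates, which is why ranking by word length and corpus frequency is needed to surface the intended word.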
Robust Eye-Based Dwell-Free Typing
TLDR: A recognition approach for inferring the words the user intends to type is proposed; the results suggest it offers better accuracy and more resilience to common text entry errors than other currently proposed dwell-free systems.
A Systematic Analysis of Literature on Dwell-Free Eye-Driven Typing
This paper surveys the literature on the emerging technology of dwell-free eye-driven typing, discussing different techniques along with the advantages and disadvantages of each.
EyeSwipe: text entry using gaze paths
TLDR: This thesis proposes EyeSwipe as a step further towards fast and comfortable text entry by gaze, and proposes the use of gaze path data to dynamically adjust the gaze estimation during the interaction.
CamType: assistive text entry using gaze with an off-the-shelf webcam
TLDR: This article proposes a prototype eye-typing system that uses an off-the-shelf webcam instead of a dedicated eye tracker; an appearance-based method estimates the user's gaze coordinates on the screen from frontal face images captured by the webcam.
Algorithm for decoding visual gestures for an assistive virtual keyboard
TLDR: This work investigates interaction methods based on eye movement tracking and presents a virtual keyboard that uses gaze detection for text input. It describes the development of the shape detection algorithm for the assistive keyboard, typed-word voting from a Brazilian Portuguese lexicon, and preliminary results for the decoding algorithm.

References

The potential of dwell-free eye-typing for fast assistive gaze communication
TLDR: It is found that after 40 minutes of practice, users reached a mean entry rate of 46 wpm, indicating that dwell-free eye-typing may be more than twice as fast as the current state-of-the-art methods for writing by gaze.
Longitudinal evaluation of discrete consecutive gaze gestures for text entry
TLDR: Subjective results indicate that participants consider EyeWrite significantly faster, easier to use, and less likely to cause ocular fatigue than the on-screen keyboard, giving it practical advantages for eye-based text entry.
What you look at is what you get: eye movement-based interaction techniques
TLDR: Some of the human factors and technical considerations that arise in trying to use eye movements as an input medium are discussed, and the first eye movement-based interaction techniques devised and implemented in the laboratory are described.
Fast gaze typing with an adjustable dwell time
TLDR: A longitudinal study examined how fast novices learn to type by gaze using an adjustable dwell time; over the sessions the text entry rate increased, the dwell time decreased, and the error rates also decreased.
Now Dasher! Dash away!: longitudinal study of fast text entry by Eye Gaze
TLDR: The results show that very high text entry rates can be achieved with eye-operated Dasher, but only after several hours of training.
An evaluation of an eye tracker as a device for computer input
TLDR: The results show that an eye tracker can be used as a fast selection device provided that the target size is not too small; if the targets are small, speed declines and errors increase rapidly.
SHARK2: a large vocabulary shorthand writing system for pen-based computers
TLDR: The architecture, algorithms and interfaces of a high-capacity multi-channel pen-gesture recognition system that supports a gradual and seamless transition from visually guided tracing to recall-based gesturing are designed and implemented.
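A simplified version of the shape channel of such a gesture recognizer can be sketched by resampling both the input trace and each word's key-center template to a fixed number of points and comparing them pointwise. The key layout below is an illustrative toy grid, not SHARK2's actual keyboard geometry or its full multi-channel scoring:

```python
import math

def resample(path, n=32):
    """Resample a polyline to n equidistant points along its arc length."""
    dists = [0.0]  # cumulative arc length at each vertex
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        t = total * i / (n - 1)
        while j < len(dists) - 2 and dists[j + 1] < t:
            j += 1
        seg = dists[j + 1] - dists[j] or 1.0
        a = (t - dists[j]) / seg
        (x0, y0), (x1, y1) = path[j], path[j + 1]
        out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    return out

def shape_distance(trace, template, n=32):
    """Mean pointwise distance between two resampled paths (shape channel only)."""
    return sum(math.hypot(px - qx, py - qy)
               for (px, py), (qx, qy) in zip(resample(trace, n),
                                             resample(template, n))) / n

# Illustrative key centers on a unit grid (hypothetical, not a real layout).
keys = {"c": (0, 0), "a": (1, 0), "t": (2, 0), "o": (2, 1)}
word_template = lambda w: [keys[ch] for ch in w]
trace = [(0.0, 0.1), (1.1, 0.0), (2.0, 0.05)]  # noisy sweep c -> a -> t
best = min(["cat", "cot"], key=lambda w: shape_distance(trace, word_template(w)))
print(best)
```

The same shape-matching idea transfers directly from pen gestures to gaze traces, which is the connection the position paper builds on.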
Shorthand writing on stylus keyboard
TLDR: The key principles of the SHARK design include high efficiency stemming from layout optimization, duality of gesturing and stylus tapping, scale- and location-independent writing, Zipf's law, and skill transfer from tapping to shorthand writing due to pattern consistency.
A General Method Applicable to the Search for Similarities in the Amino Acid Sequence of Two Proteins
A computer adaptable method for finding similarities in the amino acid sequences of two proteins has been developed. From these findings it is possible to determine whether significant homology exists between the proteins.
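The method described is the Needleman-Wunsch global alignment algorithm, a dynamic-programming comparison of two sequences that also underlies gesture- and trace-matching approaches. A minimal score-only sketch over character strings, with illustrative match/mismatch/gap weights:

```python
def nw_score(a: str, b: str, match=1, mismatch=-1, gap=-1) -> int:
    """Needleman-Wunsch global alignment score via dynamic programming.
    prev[j] holds the best score of aligning a[:i-1] against b[:j]."""
    prev = [j * gap for j in range(len(b) + 1)]
    for i, ca in enumerate(a, 1):
        cur = [i * gap]
        for j, cb in enumerate(b, 1):
            diag = prev[j - 1] + (match if ca == cb else mismatch)
            cur.append(max(diag, prev[j] + gap, cur[j - 1] + gap))
        prev = cur
    return prev[-1]

print(nw_score("GATTACA", "GCATGCU"))
```

A full implementation would also keep a traceback matrix to recover the alignment itself; the score alone suffices to compare candidate matches.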