ReType: Quick Text Editing with Keyboard and Gaze

@inproceedings{Sindhwani2019ReTypeQT,
  title     = {ReType: Quick Text Editing with Keyboard and Gaze},
  author    = {Shyamli Sindhwani and Christof Lutteroth and Gerald Weber},
  booktitle = {Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems},
  year      = {2019}
}
When a user needs to reposition the cursor during text editing, this is often done using the mouse.

Key Result: A comparative user study showed that ReType is able to match or even beat the speed of mouse-based interaction for small text edits. We conclude that the gaze-augmented user interface can make common interactions more fluent, especially for professional keyboard users.
Gaze'N'Touch: Enhancing Text Selection on Mobile Devices Using Gaze
TLDR
GazeButton, a novel concept that extends text editing with an additional modality, gaze, can improve text selection; a comparison with touch-based selection shows that gaze-based selection was faster with larger text sizes, although the difference was not statistically significant.
TapGazer: Text Entry with Finger Tapping and Gaze-directed Word Selection
TLDR
This work presents TapGazer, a text entry system in which users type by tapping their fingers in place; users can tap anywhere as long as the identity of each tapping finger can be detected with sensors.
Improving the Multi-Modal Post-Editing (MMPE) CAT Environment based on Professional Translators' Feedback
TLDR
Describes how MMPE was further refined based on professional translators' judgments about which interaction modalities are most suitable for which post-editing task, along with more general qualitative findings.
The Effects of Predictive Features of Mobile Keyboards on Text Entry Speed and Errors
TLDR
This work presents a crowd-sourced mobile text entry study with 170 participants and simulates autocorrection and word prediction to capture user behaviours around these features, finding that using word prediction saves an average of 3.43 characters per phrase but also adds two seconds compared to actually typing the word, resulting in a negative effect on text entry speed.
Exploring Smartphone-enabled Text Selection in AR-HMD
TLDR
This paper explores the use of a smartphone as an input device to support text selection in AR-HMD because of its availability, familiarity, and social acceptability, and proposes four eyes-free text selection techniques, all using a smartphone — continuous touch, discrete touch, spatial movement, and raycasting.
Voice and Touch Based Error-tolerant Multimodal Text Editing and Correction for Smartphones
TLDR
VT combines touch and voice input with language context, such as a language model and phrase similarity, to infer a user's editing intention; it can handle ambiguities and noisy input signals, a significant advantage over existing error correction methods.
EyeSayCorrect: Eye Gaze and Voice Based Hands-free Text Correction for Mobile Devices
TLDR
EyeSayCorrect, an eye gaze and voice based hands-free text correction method for mobile devices that uses a Bayesian approach for determining the selected word given an eye-gaze trajectory, is presented.
Intelligent Text Input Methods and Metrics
  • Computer Science
  • 2021
TLDR
This thesis aims to develop intelligent text input systems utilizing the state-of-the-art machine learning techniques, in order to support a comprehensive spectrum of text interactions, including entry, editing, and entry of special symbols such as emojis.
Leveraging Error Correction in Voice-based Text Entry by Talk-and-Gaze
TLDR
Talk-and-Gaze uses eye gaze to overcome the inability of voice-only systems to provide spatial information and was well received in a subjective assessment with 66% of users choosing it as their preferred choice for error correction in voice-based text entry.
Swap: A Replacement-based Text Revision Technique for Mobile Devices
TLDR
Results showed that Swap reduced effort in caret control and repetitive backspace pressing during the text revision process, making text revision rapid and intuitive.

References

Command Without a Click: Dwell Time Typing by Mouse and Gaze Selections
TLDR
Users can be productive from the first encounter with dwell time activation, but productivity depends on their familiarity with the input structure and the input mode (i.e. hand or eye).
Keyboard Surface Interaction: Making the keyboard into a pointing device
TLDR
Keyboard Surface Interaction (KSI), an interaction approach that turns the surface of a keyboard into an interaction surface and allows users to rest their hands on the keyboard at all times to minimize fatigue, is introduced.
Effective gazewriting with support of text copy and paste
  • Reo Kishi, Takahiro Hayashi
  • Computer Science
    2015 IEEE/ACIS 14th International Conference on Computer and Information Science (ICIS)
  • 2015
TLDR
An interface for text copy and paste into a gaze writing system that enables a user to specify a copy range and paste text to a desired position with only gaze interaction is introduced.
Eyes Only: Navigating Hypertext with Gaze
TLDR
The authors' results indicate that the Multiple Confirm alternative performed best among the gaze-based alternatives; it makes use of multiple confirmation buttons when letting the user choose between different options, hence improving accuracy.
Gaze typing compared with input by head and hand
This paper investigates the usability of gaze-typing systems for disabled people from a broad perspective that takes into account the usage scenarios and the particular users that these systems are intended for.
EyePoint: practical pointing and selection using gaze and keyboard
TLDR
EyePoint uses a two-step progressive refinement process fluidly stitched together in a look-press-look-release action, which makes it possible to compensate for the accuracy limitations of the current state-of-the-art eye gaze trackers.
Gaze-enhanced user interface design
TLDR
This research explores how gaze information can be effectively used as an augmented input in addition to traditional input devices and proposes solutions which, as discovered over the course of the research, can be used to mitigate these issues.
Filteryedping: Design Challenges and User Performance of Dwell-Free Eye Typing
TLDR
A dwell-free eye typing technique that filters out unintentionally selected letters from the sequence of letters looked at by the user and ranks possible words based on their length and frequency of use and suggests them to the user.
Evaluation of eye gaze interaction
TLDR
Two experiments are presented that compare an interaction technique developed for object selection based on where a person is looking with the most commonly used selection method using a mouse, and find that the eye gaze interaction technique is faster than selection with a mouse.
EyeK: an efficient dwell-free eye gaze-based text entry system
TLDR
EyeK, a gaze-based text entry system that reduces dwell time and helps mitigate visual search time, is proposed; it can easily be used on medium-sized display devices such as tablet PCs and PDAs.