Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection

@article{Chen2019UsingVD,
  title={Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection},
  author={Zhaokang Chen and Bertram E. Shi},
  journal={International Journal of Human–Computer Interaction},
  year={2019},
  volume={35},
  pages={240--255}
}
Abstract: In order to avoid the “Midas Touch” problem, gaze-based interfaces for selection often introduce a dwell time: a fixed amount of time the user must fixate upon an object before it is selected. […] In the first step, a command (e.g., “back” or “select”) is chosen from a menu using a dwell time that is constant across the different commands. In the second step, if the “select” command is chosen, the user selects a hyperlink using a dwell time that varies between different hyperlinks.
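The two-step scheme above can be sketched minimally as follows. This is an illustrative sketch, not the paper's implementation: the dwell bounds, the fixed command dwell, and the linear probability-to-dwell mapping are all assumed values chosen for demonstration.

```python
# Hypothetical sketch of two-step gaze selection with variable dwell times.
# Step 1: commands share one fixed dwell threshold.
# Step 2: each hyperlink gets its own threshold; links the system judges
# more likely to be the target require shorter fixations.
# All constants below are illustrative, not from the paper.

COMMAND_DWELL_MS = 600.0  # constant across commands (assumed value)

def link_dwell_ms(p_link: float, d_min: float = 200.0, d_max: float = 1000.0) -> float:
    """Map a link's estimated selection probability to a dwell threshold:
    p_link = 1 gives the shortest dwell (d_min), p_link = 0 the longest (d_max)."""
    return d_max - p_link * (d_max - d_min)

def selected(fixation_ms: float, threshold_ms: float) -> bool:
    """An item is selected once fixation duration reaches its dwell threshold."""
    return fixation_ms >= threshold_ms

# A link judged 90% likely is selected after a short fixation,
# while an unlikely link (10%) still demands a long one.
print(selected(300.0, link_dwell_ms(0.9)))  # likely link, short dwell suffices
print(selected(300.0, link_dwell_ms(0.1)))  # unlikely link, still locked
```

The linear mapping is the simplest choice that preserves the key property of the approach: unlikely targets keep a long, safe dwell, so the Midas Touch risk is concentrated only where the system is confident.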
Dynamic Bayesian Adjustment of Dwell Time for Faster Eye Typing
TLDR
This work proposes to speed up eye typing while maintaining low error by dynamically adjusting the dwell time for each letter based on the past input history, which enables it to assign dwell times using a principled model that requires only a few free parameters.
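The idea of adjusting per-letter dwell times from past input can be illustrated with a toy character-level model. This sketch is not the paper's Bayesian model: the bigram table, fallback probability, and dwell bounds are made-up examples standing in for a learned language model.

```python
# Illustrative sketch: shorten the dwell time of letters that are probable
# given the previously typed letter. The probabilities here are invented;
# a real system would use a trained language model.
from collections import defaultdict

# P(next | prev), with an assumed fallback for unseen bigrams.
bigram = defaultdict(lambda: 0.05)
bigram.update({("t", "h"): 0.40, ("t", "z"): 0.001})

def dwell_for(prev: str, nxt: str, d_min: float = 150.0, d_max: float = 900.0) -> float:
    """Probable letters get dwell times near d_min, improbable ones near d_max."""
    p = min(bigram[(prev, nxt)], 1.0)
    return d_max - p * (d_max - d_min)

# After typing "t", the key "h" becomes much faster than "z".
print(dwell_for("t", "h"), dwell_for("t", "z"))
```

Because improbable letters keep a long dwell, typing errors stay rare even as the common letters speed up, which is the trade-off these dwell-adjustment papers target.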
GazeWheels: Comparing Dwell-time Feedback and Methods for Gaze Input
TLDR
An evaluation and comparison of GazeWheels, techniques for dwell-time gaze input and feedback. Results show that Infinite GazeWheel and Pause-and-Resume GazeWheel are more error prone but significantly faster than Resetting GazeWheel when using an 800–1000 ms dwell time, even when including the time for correcting errors.
Gaze-based Cursor Control Impairs Performance in Divided Attention
TLDR
Evidence is provided that the adoption of interfaces controlled by human eye gaze in cognitively demanding environments requires careful design, proper testing and sufficient user training.
Cursor Click Modality in an Accelerometer-Based Computer Access Device
TLDR
Surface electromyography-based click modalities were as fast as the shortest dwell time and as accurate as the longest dwell time, and also minimized user effort; results were similar in individuals with neuromuscular disorders.
Offset Calibration for Appearance-Based Gaze Estimation via Gaze Decomposition
TLDR
This work proposes a gaze decomposition method that enables low complexity calibration, i.e., using calibration data collected when subjects view only one or a few gaze targets and the number of images per gaze target is small, to improve estimation.
Task-embedded online eye-tracker calibration for improving robustness to head motion
TLDR
An online calibration method that compensates for head movements when estimates of the gaze targets are available; for correct selections, it is reasonable to assume that the user's gaze target during the dwell time was at the key center.
Appearance Based Gaze Estimation Using Eye Region Landmarks and Math Approach
TLDR
This work extracts several effective landmarks around the eyeball and iris from monocular input using a novel learning-based method, together with a new and effective formula for deriving more accurate gaze directions.
Appearance-Based Gaze Estimation Using Dilated-Convolutions
TLDR
This work adopts dilated-convolutions to extract high-level features without reducing spatial resolution in gaze estimation and achieves state-of-the-art results on both the Columbia Gaze and the MPIIGaze datasets.
Integrated Head-Tilt and Electromyographic Cursor Control
TLDR
The findings of this study show that the ACC/sEMG system is an effective computer access method across different lighting conditions and computer orientations; however, there is a tradeoff between speed and accuracy: the ACC/sEMG system provided higher target selection accuracy than Camera Mouse, while the latter provided faster target selection.
Unsupervised Outlier Detection in Appearance-Based Gaze Estimation
TLDR
This work proposes an algorithm that detects outliers without supervision based on the input images with only gaze labels, which alleviates the impact of outliers during learning and results in a better gaze estimator.
...

References

Showing 1–10 of 34 references
Probabilistic adjustment of dwell time for eye typing
  • Jimin Pi, Bertram E. Shi
  • Computer Science
    2017 10th International Conference on Human System Interactions (HSI)
  • 2017
TLDR
A probabilistic model for gaze-based selection is proposed, which adjusts the dwell time for each letter based on the probability of that letter given the letters selected so far; the approach can be generalized to other dwell-based applications, leading to more efficient gaze-based interaction.
Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times
TLDR
A novel approach to dwell-based eye typing is presented that dynamically adjusts the dwell time of each key on an on-screen keyboard based on the likelihood that the key will be selected next and on the key's location on the keyboard.
Gaze vs. Mouse: A Fast and Accurate Gaze-Only Click Alternative
TLDR
Actigaze is presented, a new gaze-only click alternative which is fast and accurate for point-and-click tasks, and two variants of the proposed click alternative are evaluated, comparing them against the mouse and another gaze-only click alternative.
Fast gaze typing with an adjustable dwell time
TLDR
A longitudinal study to find out how fast novices learn to type by gaze using an adjustable dwell time found that the text entry rate increased and the dwell time decreased, while the error rates also decreased.
Gaze dependant prefetching of web content to increase speed and comfort of web browsing
Gaze-enhanced user interface design
TLDR
This research explores how gaze information can be effectively used as an augmented input in addition to traditional input devices and proposes solutions which, as discovered over the course of the research, can be used to mitigate these issues.
Gaze-Assisted User Intention Prediction for Initial Delay Reduction in Web Video Access
TLDR
A threaded interaction model is introduced and applied to user intention prediction for initial delay reduction in web video access, demonstrating significant improvement of accuracy and advance time in intention prediction.
GazeTheWeb: A Gaze-Controlled Web Browser
TLDR
GazeTheWeb is a Web browser accessible solely by eye gaze input that effectively supports all browsing operations like search, navigation and bookmarks and is based on a Chromium powered framework.
Intelligent gaze-added interfaces
TLDR
A standard WIMP operating-system interface was extended into a new interface, IGO, that incorporates intelligent gaze-added input and it was found that users quickly adapted to the new interface and utilized gaze effectively both alone and with other inputs.
Efficient eye pointing with a fisheye lens
TLDR
This paper uses a fisheye lens and a video-based eye tracker to locally magnify the display at the point of the user's gaze to facilitate eye pointing and selection of magnified (expanded) targets.
...