Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements

@inproceedings{Esteves2015OrbitsGI,
  title={Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements},
  author={Augusto Esteves and Eduardo Velloso and Andreas Bulling and Hans-Werner Gellersen},
  booktitle={Proceedings of the 28th Annual ACM Symposium on User Interface Software \& Technology (UIST '15)},
  year={2015}
}
We introduce Orbits, a novel gaze interaction technique that enables hands-free input on smart watches. The technique relies on moving controls to leverage the smooth pursuit movements of the eyes and to detect whether, and at which control, the user is looking. In Orbits, controls consist of targets that move along a circular trajectory on the watch face and can be selected by following the desired target with the eyes for a short period of time. We conducted two user studies to assess the technique's…
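The detection step described above — deciding whether, and which, moving control the user is following — is typically implemented by correlating the gaze trajectory with each target's known trajectory over a sliding window. A minimal sketch of that idea in Python; the per-axis Pearson formulation, window length, and threshold are illustrative assumptions, not the authors' exact algorithm or parameters:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0  # a motionless signal correlates with nothing
    return cov / (sx * sy)

def detect_pursuit(gaze, targets, threshold=0.8):
    """Return the id of the target whose trajectory best matches the gaze
    window, or None if no correlation exceeds the threshold.

    gaze:    list of (x, y) gaze samples over the window
    targets: dict mapping target id -> list of (x, y) positions,
             time-aligned with the gaze samples
    """
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best_id, best_score = None, threshold
    for tid, traj in targets.items():
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Require both axes to correlate; take the weaker axis as the score.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best_id, best_score = tid, score
    return best_id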


Feedback for Smooth Pursuit Gaze Tracking Based Control
Comparing feedback modalities (visual, auditory, haptic, none) in a continuous adjustment technique for smooth pursuit gaze tracking shows a clear user preference and acceptability for haptic and audio feedback.
SPOCK: A Smooth Pursuit Oculomotor Control Kit
This work presents SPOCK, a novel gaze interaction method based on smooth pursuit eye movements requiring only minimal extensions to button-based interfaces, and evaluated it against dwell time and a more challenging multiple-choice scenario.
Pursuits-based Gaze Interaction using Real World Elements as Stimuli
This work explores the potential and feasibility of Pursuits-based gaze interaction with moving stimuli in the form of real world elements and designs a usability study in the real world, where three scenarios on user acceptance and Pursuits performance are tested.
EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
EyeScout is presented, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement, and is well perceived by users.
Exploration of smooth pursuit eye movements for gaze calibration in games
This work tried to understand the relationship between gaze and motion when performing smooth pursuit movements through the integration of calibration within a videogame and proposed to leverage the attentive gaze behavior of the eyes during gameplay for implicit and continuous re-calibration.
GazeEverywhere: Enabling Gaze-only User Interaction on an Unmodified Desktop PC in Everyday Scenarios
The GazeEverywhere solution that can replace the mouse with gaze control by adding a transparent layer on top of the system GUI is presented, and it is shown that users were able to browse the internet and successfully run Wikirace using gaze only, without any plug-ins or other modifications.
Calibration-free gaze interfaces based on linear smooth pursuit
Results show that the number and speed of the displayed objects influence users’ performance with the interface, and can help to enable a calibration-free accessible interaction with gaze interfaces.
Smooth eye movement interaction using EOG glasses
A pilot study suggests that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad, and combined with a gyro to create an interface with a maximum input rate of 5.0 bps.
Eye Movements and Human-Computer Interaction
This chapter gives an introduction to using gaze as an input method and shows how to use it as an explicit control method and how to exploit it subtly in the background as an additional information channel.
EyeFlow: pursuit interactions using an unmodified camera
The smooth pursuit eye movement based interaction using an unmodified off-the-shelf RGB camera is investigated using EyeFlow, which does not call for either a 3D pupil model or 2D pupil detection to track the pupil center location.

References

Showing 1–10 of 39 references
Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets
The results show that Pursuits is a versatile and robust technique and that users can interact with Pursuits-based interfaces without prior knowledge or preparation phase.
Entering PIN codes by smooth pursuit eye movements
Two experiments are described which use smooth pursuit eye movements on moving display buttons to extract an easy and fast interaction concept and at the same time to collect data to develop a specific but robust algorithm.
SMOOVS: Towards calibration-free text entry by gaze using smooth pursuit movements
The first gaze speller explicitly utilizing smooth pursuit eye movements and their particular characteristics is developed, which achieves sufficient accuracy with a one-point calibration and does not require extensive training.
Evaluation of eye gaze interaction
Two experiments are presented that compare an interaction technique developed for object selection based on where a person is looking with the most commonly used selection method using a mouse, finding that the eye gaze interaction technique is faster than selection with a mouse.
SideWays: a gaze interface for spontaneous interaction with situated displays
SideWays is presented, a novel person-independent eye gaze interface that supports spontaneous interaction with displays: users can just walk up to a display and immediately interact using their eyes, without any prior user calibration or training.
Pursuit calibration: making gaze calibration less tedious and more flexible
This work presents pursuit calibration, a novel approach that, unlike existing methods, is able to detect the user's attention to a calibration target, by using moving targets, and correlation of eye movement and target trajectory, implicitly exploiting smooth pursuit eye movement.
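Pursuit calibration, as summarized above, pairs gaze samples with the moving target's known positions while the user follows it; once enough matched pairs are collected, a mapping from raw gaze coordinates to screen coordinates can be fitted. A minimal sketch assuming an independent per-axis linear model fitted by least squares — the model choice and helper names are illustrative, not the paper's method:

```python
def fit_axis(raw, true):
    """Least-squares fit of true ≈ a * raw + b along one axis."""
    n = len(raw)
    mr, mt = sum(raw) / n, sum(true) / n
    var = sum((r - mr) ** 2 for r in raw)
    cov = sum((r - mr) * (t - mt) for r, t in zip(raw, true))
    a = cov / var
    b = mt - a * mr
    return a, b

def fit_calibration(raw_pts, true_pts):
    """Fit independent linear maps for x and y from matched samples.

    raw_pts:  uncalibrated gaze samples [(x, y), ...]
    true_pts: target positions at the same instants [(x, y), ...]
    Returns a function mapping a raw gaze point to screen coordinates.
    """
    ax, bx = fit_axis([p[0] for p in raw_pts], [p[0] for p in true_pts])
    ay, by = fit_axis([p[1] for p in raw_pts], [p[1] for p in true_pts])
    return lambda p: (ax * p[0] + bx, ay * p[1] + by)
```

Because the target's on-screen position is known at every instant, this fitting can run implicitly in the background whenever a pursuit is detected, which is what makes the approach feel calibration-free to the user.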
Watchit: simple gestures and eyes-free interaction for wristwatches and bracelets
WatchIt is a prototype device that extends interaction beyond the watch surface to the wristband; two interaction techniques for command selection and execution are presented, along with a novel gesture technique and an adaptation of an existing menu technique suitable for wristband interaction.
Manual and gaze input cascaded (MAGIC) pointing
This work explores a new direction in utilizing eye gaze for computer input by proposing an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing, which might offer many advantages, including reduced physical effort and fatigue compared to traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing.
Analysing EOG signal features for the discrimination of eye movements with wearable devices
A set of basic signal features are developed that are extracted from the collected eye movement data and show that a feature-based approach has the potential to discriminate between saccades, smooth pursuits, and vestibulo-ocular reflex movements.
Gaze input for mobile devices by dwell and gestures
In an experiment with 11 subjects, gaze interaction was found to have a lower performance than touch interaction but comparable to the error rate and completion time of accelerometer (i.e. tilt) interaction, suggesting that this method may be the best option for hands-free gaze control of smartphones.