Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality

@inproceedings{Kyt2018PinpointingPH,
  title={Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality},
  author={Mikko Kyt{\"o} and Barrett Ens and Thammathip Piumsomboon and Gun A. Lee and Mark Billinghurst},
  booktitle={Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems},
  year={2018}
}
Head and eye movement can be leveraged to improve the user's interaction repertoire for wearable displays. Head movements are deliberate and accurate, and provide the current state-of-the-art pointing technique. Eye gaze can potentially be faster and more ergonomic, but suffers from low accuracy due to calibration errors and drift of wearable eye-tracking sensors. This work investigates precise, multimodal selection techniques using head motion and eye gaze. A comparison of speed and pointing… 
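To make the approach concrete, below is a minimal sketch of the general gaze-then-head refinement pattern the abstract describes: gaze places a coarse cursor, and deliberate head motion then nudges it at reduced gain. This is an illustration under assumed names, gain, and trigger, not the paper's implementation.

```python
# Illustrative sketch: coarse gaze pointing refined by head motion.
# Not the paper's implementation; names, gain, and trigger are assumptions.

REFINE_GAIN = 0.15  # cursor degrees moved per degree of head rotation

class RefinedPointer:
    def __init__(self):
        self.cursor = (0.0, 0.0)  # cursor direction (yaw, pitch), in degrees
        self.refining = False

    def on_gaze(self, yaw, pitch):
        # Coarse phase: the cursor follows the (noisy) gaze estimate.
        if not self.refining:
            self.cursor = (yaw, pitch)

    def start_refinement(self):
        # Entered via some explicit trigger (button, dwell); freezes gaze input.
        self.refining = True

    def on_head_delta(self, d_yaw, d_pitch):
        # Fine phase: small head rotations nudge the cursor at reduced gain,
        # so accurate head motion compensates for gaze calibration error.
        if self.refining:
            y, p = self.cursor
            self.cursor = (y + REFINE_GAIN * d_yaw, p + REFINE_GAIN * d_pitch)
```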
Citations

Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection
TLDR
This work proposes to leverage the synergetic movement of eye and head and identifies design principles for Eye&Head gaze interaction, enabling dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and fast, iterative confirmation of targets.
Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality
TLDR
A study of gaze shifts in virtual reality aims to address this gap and inform design, arguing to treat gaze as multimodal input, and eye, head, and body movement as synergetic in interaction design.
BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement
TLDR
BimodalGaze is introduced, a novel technique for seamless head-based refinement of a gaze cursor that leverages eye-head coordination insights to separate natural from gestural head movement.
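As an illustration of that separation, here is a hedged sketch of a simple classifier: when the vestibulo-ocular reflex keeps gaze roughly stable in space while the head turns, the head motion can be treated as a deliberate refinement gesture. Thresholds are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of one way to separate natural from gestural head movement,
# in the spirit of BimodalGaze. Thresholds are illustrative assumptions.

GAZE_STABLE_DEG_S = 10.0  # gaze-in-space speed below this => user is fixating
HEAD_MOVING_DEG_S = 5.0   # head speed above this => head is actually moving

def classify_head_motion(head_speed_deg_s, gaze_in_space_speed_deg_s):
    """Return 'gestural' if the head moves while gaze stays put in space,
    'natural' if head and gaze shift together, and 'idle' otherwise."""
    if head_speed_deg_s < HEAD_MOVING_DEG_S:
        return "idle"
    if gaze_in_space_speed_deg_s < GAZE_STABLE_DEG_S:
        return "gestural"  # VOR holds gaze on target; head motion refines
    return "natural"       # eyes and head move together: an ordinary gaze shift
```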
Radi-Eye: Hands-Free Radial Interfaces for 3D Interaction using Gaze-Activated Head-Crossing
TLDR
The results show that Radi-Eye provides users with fast and accurate input while opening up a new design space for hands-free fluid interaction, and the effect of radial interface scale and orientation on performance with Look & Cross is evaluated.
The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality
With innovations in gaze and eye tracking, a new concentration of research on gaze-tracked systems and user interfaces has formed in Extended Reality (XR). Eye…
Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR
TLDR
A text entry user study shows that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique.
Comparison of Eye-Based and Controller-Based Selection in Virtual Reality
TLDR
Fitts’ law modeling of eye-based selection in a virtual reality environment is explored and compared with controller-based input, providing a baseline for two types of eye-based interaction (dwell and physical trigger) in both three-dimensional and two-dimensional environments.
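For reference, the Shannon formulation of Fitts’ law used in this kind of modeling is sketched below; the fitted constants and example numbers are illustrative only.

```python
import math

# Shannon formulation of Fitts' law, the standard model behind selection
# studies like the one above: movement time grows with the index of
# difficulty ID = log2(D/W + 1), where D is target distance and W its width.

def index_of_difficulty(distance, width):
    """ID in bits for a target at `distance` with effective `width`."""
    return math.log2(distance / width + 1.0)

def predicted_movement_time(distance, width, a, b):
    """MT = a + b * ID; a and b are fitted per technique from study data."""
    return a + b * index_of_difficulty(distance, width)

def throughput(distance, width, observed_time_s):
    """Throughput in bits/s, a combined speed-accuracy measure."""
    return index_of_difficulty(distance, width) / observed_time_s

# Example with made-up numbers: a target 10 deg away and 2 deg wide selected
# in 0.9 s gives ID = log2(6) ~ 2.58 bits and throughput ~ 2.87 bits/s.
```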
EyePointing: A Gaze-Based Selection Technique
TLDR
EyePointing is proposed, a technique that combines MAGIC pointing with a referential mid-air pointing gesture to select objects at a distance: eye gaze references the object, while the pointing gesture serves as the trigger.
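A minimal sketch of that gaze-references, gesture-triggers pattern follows; the data model and gesture hook are assumptions for illustration.

```python
import math

# Hedged sketch of the division of labor EyePointing layers on MAGIC
# pointing: gaze references the object, the mid-air pointing gesture
# triggers the selection. Target representation and the gesture flag
# are illustrative assumptions.

def select_target(gaze_xy, targets, radius, pointing_gesture_active):
    """Return the name of the target nearest the gaze point, but only at
    the moment the pointing gesture fires; gaze alone never selects,
    which avoids the Midas-touch problem."""
    if not pointing_gesture_active:
        return None
    gx, gy = gaze_xy
    dist, name = min(((math.hypot(x - gx, y - gy), n)
                      for n, (x, y) in targets.items()),
                     default=(math.inf, None))
    return name if dist <= radius else None
```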
Accurate Real-time 3D Gaze Tracking Using a Lightweight Eyeball Calibration
TLDR
A novel lightweight eyeball calibration scheme is presented that determines the user-specific visual axis, eyeball size, and position in the head, and achieves state-of-the-art accuracy in gaze angle estimation.

References

Showing 1-10 of 73 references
Exploring natural eye-gaze-based interaction for immersive virtual reality
TLDR
This paper explores three novel eye-gaze-based interaction techniques: Duo-Reticles, eye-gaze selection based on eye-gaze and inertial reticles; Radial Pursuit, cluttered-object selection that takes advantage of smooth pursuit; and Nod and Roll, head-gesture-based interaction based on the vestibulo-ocular reflex.
Look and lean: accurate head-assisted eye pointing
TLDR
It is concluded that head-assisted eye pointing is a comfortable and potentially very efficient alternative to other assisting methods in eye pointing, such as zooming.
Head and Eye Movement as Pointing Modalities for Eyewear Computers
TLDR
Using head and eye movements to point on a graphical user interface of a wearable computer showed that eye pointing is significantly faster than head or mouse pointing; however, participants thought that head pointing was more accurate and convenient.
SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality
TLDR
Results show that SmoothMoves is viable, efficient, and immediately available to a wide range of wearable devices that feature embedded motion sensing.
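To illustrate the matching step that pursuit-based techniques like SmoothMoves rely on, the sketch below correlates recent head-motion samples against each candidate target's trajectory (Python 3.10+ for statistics.correlation); window, threshold, and data layout are assumptions.

```python
from statistics import correlation  # Pearson's r; Python 3.10+

# Sketch of the trajectory-matching step behind pursuit-based selection:
# the moving target whose on-screen path best correlates with the user's
# recent head-motion samples is selected. Threshold and data layout are
# illustrative assumptions, not values from the paper.

CORR_THRESHOLD = 0.8  # minimum mean correlation to accept a match

def best_pursuit_match(head_xs, head_ys, targets):
    """targets maps name -> (xs, ys): target trajectories sampled over the
    same time window as the head samples. Returns the matched target name,
    or None if nothing correlates strongly enough. Inputs must be
    non-constant sequences of equal length (else statistics.correlation
    raises StatisticsError)."""
    best_name, best_score = None, CORR_THRESHOLD
    for name, (txs, tys) in targets.items():
        # Correlate horizontal and vertical components separately, then average.
        score = (correlation(head_xs, txs) + correlation(head_ys, tys)) / 2
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```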
Still looking: investigating seamless gaze-supported selection, positioning, and manipulation of distant targets
TLDR
This work carefully elaborates two novel and consistent sets of gaze-supported interaction techniques, based on touch-enhanced gaze pointers and local magnification lenses, that allow for fluently selecting and positioning distant targets.
Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency
TLDR
A method is presented to automatically self-calibrate head-mounted eye trackers based on a computational model of bottom-up visual saliency, achieving gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design
TLDR
It is found that accuracy and precision can vary more than six-fold between users and targets, and differences across lighting conditions, trackers, and screen regions are reported.
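For context, the sketch below shows how accuracy (systematic offset) and precision (sample scatter) are conventionally computed from raw gaze samples; the sample format is an assumption.

```python
import math

# Sketch of the standard definitions behind that finding: accuracy is the
# systematic angular offset from the true target, precision is the
# sample-to-sample scatter. Sample format (lists of (x, y) gaze points in
# degrees) is an illustrative assumption.

def accuracy_deg(samples, target):
    """Mean Euclidean offset of gaze samples from the target, in degrees."""
    tx, ty = target
    return sum(math.hypot(x - tx, y - ty) for x, y in samples) / len(samples)

def precision_rms_deg(samples):
    """RMS of angular distances between successive samples, in degrees.
    Assumes at least two samples."""
    deltas = [math.hypot(x2 - x1, y2 - y1)
              for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return math.sqrt(sum(d * d for d in deltas) / len(deltas))
```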
Enhanced gaze interaction using simple head gestures
TLDR
A combination of gaze pointing and head gestures for enhanced hands-free interaction could potentially provide a natural and more accurate interaction method for multimodal games or transient interactions in pervasive and mobile environments.
Look & touch: gaze-supported target acquisition
TLDR
A set of novel and practical gaze-supported selection techniques for distant displays, designed according to the principle of gaze suggesting and touch confirming, which includes an enhanced gaze-directed cursor, local zoom lenses, and more elaborate techniques utilizing manual fine positioning of the cursor via touch.
The eyes don't have it: an empirical comparison of head-based and eye-based selection in virtual reality
TLDR
A study comparing selection performance across three eye/head interaction techniques using the recently released FOVE head-mounted display (HMD), which offers an integrated eye tracker, indicates that eye-only selection offered the worst performance in terms of error rate, selection time, and throughput.