Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors

@inproceedings{Li2020OpticalGT,
  title={Optical Gaze Tracking with Spatially-Sparse Single-Pixel Detectors},
  author={Richard Li and Eric Whitmire and Michael Stengel and Ben Boudaoud and Jan Kautz and David P. Luebke and Shwetak N. Patel and Kaan Aksit},
  booktitle={2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
  year={2020},
  pages={117--126}
}
  • Published 15 September 2020
Gaze tracking is an essential component of next generation displays for virtual reality and augmented reality applications. Traditional camera-based gaze trackers used in next generation displays are known to be lacking in one or multiple of the following metrics: power consumption, cost, computational complexity, estimation accuracy, latency, and form-factor. We propose the use of discrete photodiodes and light-emitting diodes (LEDs) as an alternative to traditional camera-based gaze tracking… 
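The abstract proposes estimating gaze from a handful of single-pixel photodiode readings rather than camera images. As a hedged illustration of that idea (not the paper's actual pipeline), the sketch below fits a ridge-regression map from a hypothetical ring of photodiode intensities to 2D gaze angles on synthetic calibration data; the sensor count, layout, and linear response model are all assumptions.

```python
import numpy as np

# Illustrative sketch only: regress 2D gaze angles from a sparse set of
# single-pixel photodiode readings. The sensor count, linear response
# model, and calibration procedure are assumptions for demonstration.

rng = np.random.default_rng(0)

N_SENSORS = 8     # hypothetical photodiodes arranged around the eye
N_SAMPLES = 500   # calibration samples (gaze targets shown to the user)

# Synthetic calibration data: each gaze direction (yaw, pitch, in degrees)
# produces photodiode intensities under an assumed linear response.
true_map = rng.normal(size=(N_SENSORS, 2))
gaze = rng.uniform(-20.0, 20.0, size=(N_SAMPLES, 2))
readings = gaze @ true_map.T + 0.05 * rng.normal(size=(N_SAMPLES, N_SENSORS))

# Ridge regression in closed form: W = (X^T X + lam*I)^(-1) X^T Y.
lam = 1e-3
X, Y = readings, gaze
W = np.linalg.solve(X.T @ X + lam * np.eye(N_SENSORS), X.T @ Y)

# Calibration error of the fitted map, in degrees.
rmse = float(np.sqrt(np.mean((X @ W - Y) ** 2)))
print(f"calibration RMSE: {rmse:.3f} degrees")
```

A closed-form linear fit is the simplest stand-in here; the paper's actual estimator may differ substantially.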


Eye-Tracking Beyond Peripersonal Space in Virtual Reality: Validation and Best Practices
TLDR
It is concluded that vergence is a potentially problematic basis for estimating gaze depth in VR and should be used with caution as the field moves towards a more established method for 3D eye tracking.
Head and Gaze Orientation in Hemispheric Image Viewing
Head-mounted displays provide a good platform for viewing immersive 360° or hemispheric images. A person can observe the image all around just by turning his or her head and looking at different…
Learned holographic light transport
TLDR
This work addresses the mismatch between ideal simulations and physical hardware by learning the holographic light transport of a holographic display: a camera captures the image reconstructions of holograms optimized under ideal simulations, generating a training dataset.
Metameric Varifocal Holograms
TLDR
A new CGH method is proposed that exploits gaze-contingency and perceptual graphics to accelerate the development of practical holographic display systems and prioritises and improves foveal visual quality without causing perceptually visible distortions at the periphery.
BinoVFAR: An Efficient Binocular Visual Field Assessment Method using Augmented Reality Glasses
TLDR
A robust binocular visual field assessment method based on novel Augmented Reality (AR) glasses, namely BinoVFAR, is presented that can simultaneously assess the visual field (VF) of both eyes.
Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz
TLDR
This work proposes a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions.
An integrative view of foveated rendering
TLDR
An up-to-date integrative view of the domain from the point of view of the rendering methods employed, discussing general characteristics, commonalities, differences, advantages, and limitations, as well as the main areas where foveated rendering is already in use today.

References

Showing 1-10 of 78 references
InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation
TLDR
InvisibleEye is presented, a novel approach for mobile eye tracking that uses millimetre-size RGB cameras which can be fully embedded into normal glasses frames, achieving a top person-specific gaze estimation accuracy of 1.79° using four cameras with a resolution of only 5 × 5 pixels.
Ultra-Low Power Gaze Tracking for Virtual Reality
TLDR
LiGaze is a low-cost, low-power approach to gaze tracking tailored to VR that relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters.
Making stand-alone PS-OG technology tolerant to the equipment shifts
TLDR
A simple multi-layer perceptron neural network is employed to map raw sensor data to gaze locations, and its shift-compensation performance is reported for PS-OG, which allows eye tracking with a high sampling rate and low power consumption.
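The TLDR above describes mapping raw photosensor readings to gaze locations with a small multi-layer perceptron trained to tolerate equipment shifts. A minimal sketch of that idea, assuming synthetic sensor data with per-session additive shifts and a hand-rolled one-hidden-layer MLP (not the paper's architecture or data):

```python
import numpy as np

# Sketch under stated assumptions: a tiny MLP maps 4 synthetic sensor
# readings to 2D gaze, trained on data containing several additive
# "equipment shifts" so the learned map tolerates them.

rng = np.random.default_rng(1)

def make_batch(n, shift):
    """Synthetic readings for random gaze points, with a per-session
    additive shift applied uniformly to the sensor vector."""
    gaze = rng.uniform(-1.0, 1.0, size=(n, 2))
    response = np.array([[0.9, -0.3, 0.5, 0.1],
                         [0.2, 0.8, -0.4, 0.6]])  # assumed sensor model
    sensors = gaze @ response + shift + 0.01 * rng.normal(size=(n, 4))
    return sensors, gaze

# Training set spans several shifts, a stand-in for sensor displacement.
batches = [make_batch(200, s) for s in (0.0, 0.1, -0.1)]
X = np.vstack([b[0] for b in batches])
Y = np.vstack([b[1] for b in batches])

# One-hidden-layer MLP (4 -> 16 -> 2) trained with full-batch gradient
# descent on mean squared error.
H, lr = 16, 0.1
W1 = 0.5 * rng.normal(size=(4, H)); b1 = np.zeros(H)
W2 = 0.5 * rng.normal(size=(H, 2)); b2 = np.zeros(2)
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - Y
    dh = (err @ W2.T) * (1.0 - h ** 2)            # backprop through tanh
    W2 -= lr * (h.T @ err) / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * (X.T @ dh) / len(X); b1 -= lr * dh.mean(0)

rmse = float(np.sqrt(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2)))
print(f"train RMSE: {rmse:.3f}")
```

Because the shifts appear in the training data, the network learns a map that absorbs them; the real system's sensors, shift model, and network size would differ.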
A Low-Computational Approach on Gaze Estimation With Eye Touch System
TLDR
The results of the experimental analysis over numerous subjects clearly indicate that the proposed eye tracking system can classify eye winks with 98% accuracy and attain an accurate gaze direction with an average angular error of about 0.93°.
Learning an appearance-based gaze estimator from one million synthesised images
TLDR
The UnityEyes synthesis framework combines a novel generative 3D model of the human eye region with a real-time rendering framework and shows that these synthesized images can be used to estimate gaze in difficult in-the-wild scenarios, even for extreme gaze angles.
Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
TLDR
The benefits of the synthesized training data (SynthesEyes) are demonstrated by out-performing state-of-the-art methods for eye-shape registration as well as cross-dataset appearance-based gaze estimation in the wild.
Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction
TLDR
Pupil is an accessible, affordable, and extensible open source platform for pervasive eye tracking and gaze-based interaction and includes state-of-the-art algorithms for real-time pupil detection and tracking, calibration, and accurate gaze estimation.
Foveated AR
TLDR
This work presents a near-eye augmented reality display with resolution and focal depth dynamically driven by gaze tracking, and shows prototypes supporting 30, 40 and 60 cpd foveal resolution at a net 85° × 78° field of view per eye.
NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation
TLDR
This work creates a synthetic dataset using anatomically-informed eye and face models with variations in face shape, gaze direction, pupil and iris, skin tone, and external conditions, and trains neural networks performing with sub-millisecond latency.
Battery-Free Eye Tracker on Glasses
TLDR
A battery-free wearable eye tracker is proposed that achieves sub-millimeter tracking accuracy at high tracking rates, exploits characteristics of different eye movement stages, and adjusts its sensing and computation accordingly for further energy savings.