Corpus ID: 213005692

MagicEyes: A Large Scale Eye Gaze Estimation Dataset for Mixed Reality

@article{Wu2020MagicEyesAL,
  title={MagicEyes: A Large Scale Eye Gaze Estimation Dataset for Mixed Reality},
  author={Zhengyang Wu and Srivignesh Rajendran and Tarrence van As and Joelle Zimmermann and Vijay Badrinarayanan and Andrew Rabinovich},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.08806}
}
With the emergence of Virtual and Mixed Reality (XR) devices, eye tracking has received significant attention in the computer vision community. Eye gaze estimation is a crucial component in XR -- enabling energy-efficient rendering, multi-focal displays, and effective interaction with content. In head-mounted XR devices, the eyes are imaged off-axis to avoid blocking the field of view. This leads to increased challenges in inferring eye-related quantities and simultaneously provides an…

Citations of this paper

A dataset of eye gaze images for calibration-free eye tracking augmented reality headset

The ARGaze dataset is presented, a dataset with 1,321,968 pairs of eye gaze images at 32 × 32 pixel resolution and 50 corresponding videos of world views based on a replicable augmented reality headset that achieves record low gaze estimation error.

TEyeD: Over 20 Million Real-World Eye Images with Pupil, Eyelid, and Iris 2D and 3D Segmentations, 2D and 3D Landmarks, 3D Eyeball, Gaze Vector, and Eye Movement Types

The world’s largest unified public data set of eye images taken with head-mounted devices, TEyeD provides a unique, coherent resource and a valuable foundation for advancing research in the field of computer vision, eye tracking and gaze estimation in modern VR and AR applications.

EyeNeRF: A Hybrid Representation for Photorealistic Synthesis, Animation and Relighting of Human Eyes

This novel hybrid model has been designed specifically to address the various parts of that exceptionally challenging facial area - the explicit eyeball surface allows modeling refraction and high frequency specular reflection at the cornea, whereas the implicit representation is well suited to model lower frequency skin reflection via spherical harmonics.

Improving the Deeplabv3+ Model with Attention Mechanisms Applied to Eye Detection and Segmentation

A dataset is established to reduce the effort of cropping eye images and annotating labels, and the results show that the IDLN model achieves appropriate segmentation accuracy for both eye images, while the UNet and ISANet models give the best results.

Neural Network Based Eyetracking System

The goal of this thesis is to create a framework for training eye-tracking models using neural networks designed for regression and common image processing techniques, and to compare several integrated neural network architectures in terms of their accuracy and speed.

Gaze Estimation Based on Multi-view Geometric Neural Networks


References

Showing 1-10 of 40 references

MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation

It is shown that image resolution and the use of both eyes affect gaze estimation performance, while head pose and pupil centre information are less informative, and GazeNet is proposed, the first deep appearance-based gaze estimation method.

InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation

InvisibleEye is presented, a novel approach for mobile eye tracking that uses millimetre-size RGB cameras which can be fully embedded into normal glasses frames, and achieves a top person-specific gaze estimation accuracy of 1.79° using four cameras with a resolution of only 5 × 5 pixels.

NVGaze: An Anatomically-Informed Dataset for Low-Latency, Near-Eye Gaze Estimation

This work creates a synthetic dataset using anatomically-informed eye and face models with variations in face shape, gaze direction, pupil and iris, skin tone, and external conditions, and trains neural networks performing with sub-millisecond latency.

Rendering of Eyes for Eye-Shape Registration and Gaze Estimation

The benefits of the synthesized training data (SynthesEyes) are demonstrated by out-performing state-of-the-art methods for eye-shape registration as well as cross-dataset appearance-based gaze estimation in the wild.

Gaze locking: passive eye contact detection for human-object interaction

This work proposes a passive, appearance-based approach for sensing eye contact in an image by focusing on gaze *locking* rather than gaze tracking, and demonstrates how this method facilitates human-object interaction, user analytics, image filtering, and gaze-triggered photography.

Learning-by-Synthesis for Appearance-Based 3D Gaze Estimation

This paper presents a learning-by-synthesis approach to accurate image-based gaze estimation that is person- and head pose-independent and outperforms existing methods that use low-resolution eye images.

EyeTab: model-based gaze estimation on unmodified tablet computers

EyeTab is presented, a model-based approach for binocular gaze estimation that runs entirely on an unmodified tablet; it builds on a set of established image processing and computer vision algorithms and adapts them for robust and near-realtime gaze estimation.

Novel Eye Gaze Tracking Techniques Under Natural Head Movement

Zhiwei Zhu, Q. Ji · IEEE Transactions on Biomedical Engineering · 2007
Two novel solutions are proposed that allow natural head movement and reduce calibration to a one-time procedure for each new individual, representing an important step toward the eye tracker being accepted as a natural computer input device.

Labelled pupils in the wild: a dataset for studying pupil detection in unconstrained environments

Labelled pupils in the wild (LPW), a novel dataset of 66 high-quality, high-speed eye region videos for the development and evaluation of pupil detection algorithms, provides valuable insights into the general pupil detection problem and allows us to identify key challenges for robust pupil detection on head-mounted eye trackers.