Building a Personalized, Auto-Calibrating Eye Tracker from User Interactions

@inproceedings{Huang2016BuildingAP,
  title={Building a Personalized, Auto-Calibrating Eye Tracker from User Interactions},
  author={Michael Xuelin Huang and Tiffany C. K. Kwok and Grace Ngai and Stephen Chi-fai Chan and Hong Va Leong},
  booktitle={Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems},
  year={2016}
}
We present PACE, a Personalized, Automatically Calibrating Eye-tracking system that identifies and collects data unobtrusively from user interaction events on standard computing systems without the need for specialized equipment. PACE relies on eye/facial analysis of webcam data based on a set of robust geometric gaze features and a two-layer data validation mechanism to identify good training samples from daily interaction data. The design of the system is founded on an in-depth investigation… 
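The abstract outlines an interaction-driven training loop: webcam frames captured around an interaction event (e.g., a click) are paired with the on-screen interaction point as a weak gaze label, reduced to geometric gaze features, and passed through a two-layer validation step before being added to the personalized model. The following is a minimal, hypothetical sketch of that idea only; the class and function names, the nearest-neighbour stand-in model, the feature encoding, and the validation layers and their thresholds are illustrative assumptions, not PACE's actual implementation.

```python
# Hypothetical sketch of interaction-driven gaze-model training.
# All names, features, and thresholds are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class InteractionSample:
    gaze_features: Tuple[float, ...]   # geometric features from a webcam frame (assumed)
    screen_xy: Tuple[float, float]     # interaction point used as a weak gaze label


class PersonalizedGazeModel:
    """Toy stand-in for the personalized gaze model."""

    def __init__(self) -> None:
        self.samples: List[InteractionSample] = []

    def predict(self, gaze_features: Tuple[float, ...]) -> Optional[Tuple[float, float]]:
        if not self.samples:
            return None
        # Nearest-neighbour lookup keeps the sketch dependency-free.
        nearest = min(
            self.samples,
            key=lambda s: sum((a - b) ** 2 for a, b in zip(s.gaze_features, gaze_features)),
        )
        return nearest.screen_xy

    def add(self, sample: InteractionSample) -> None:
        self.samples.append(sample)


def plausible_fixation(gaze_features: Tuple[float, ...]) -> bool:
    """First validation layer (assumed): reject frames with implausible
    eye/face geometry, e.g. closed eyes or extreme head pose."""
    return all(abs(v) < 1.0 for v in gaze_features)


def consistent_with_model(model: PersonalizedGazeModel,
                          sample: InteractionSample,
                          tol: float = 0.75) -> bool:
    """Second validation layer (assumed): cross-check the new sample against
    predictions from previously accepted data. The tolerance is arbitrary."""
    pred = model.predict(sample.gaze_features)
    if pred is None:
        return True  # nothing to cross-check against yet
    dx = pred[0] - sample.screen_xy[0]
    dy = pred[1] - sample.screen_xy[1]
    return (dx * dx + dy * dy) ** 0.5 < tol


def on_interaction(model: PersonalizedGazeModel,
                   gaze_features: Tuple[float, ...],
                   click_xy: Tuple[float, float]) -> None:
    """Called on each click/tap: pair the webcam-derived features with the
    interaction point and keep the sample only if both validation layers pass."""
    sample = InteractionSample(tuple(gaze_features), click_xy)
    if plausible_fixation(sample.gaze_features) and consistent_with_model(model, sample):
        model.add(sample)


# Example: simulate a few clicks whose features roughly track screen position.
model = PersonalizedGazeModel()
for x, y in [(0.1, 0.1), (0.5, 0.5), (0.9, 0.8)]:
    on_interaction(model, gaze_features=(x - 0.5, y - 0.5), click_xy=(x, y))
print(model.predict((0.4, 0.3)))  # -> (0.9, 0.8)
```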

Citations

Quick Bootstrapping of a Personalized Gaze Model from Real-Use Interactions
TLDR
Fast-PACE is presented, an adaptation of PACE that exploits existing data from other users to accelerate the learning of the personalized model; the result is an adaptive, data-driven approach that continuously "learns" its user and recalibrates, adapts, and improves with additional usage.
CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction
TLDR
CalibMe is introduced, a novel method that exploits collection markers (automatically detected fiducial markers) to allow eye tracker users to gather a large array of calibration points, remove outliers, and automatically reserve evaluation points in a fast and unsupervised manner.
Training Person-Specific Gaze Estimators from User Interactions with Multiple Devices
TLDR
This work presents the first method to train person-specific gaze estimators across multiple devices using a single convolutional neural network with shared feature extraction layers and device-specific branches that are trained from face images and corresponding on-screen gaze locations.
Towards End-to-end Video-based Eye-Tracking
TLDR
This work proposes a novel dataset and an accompanying method, and demonstrates that fusing information from visual stimuli and eye images can achieve performance comparable to literature-reported figures obtained through supervised personalization.
WebGazer: Scalable Webcam Eye Tracking Using User Interactions
TLDR
The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user's gaze.
Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications
TLDR
This work evaluates the performance of state-of-the-art appearance-based gaze estimation for interaction scenarios with and without personal calibration, indoors and outdoors, for different sensing distances, as well as for users with and without glasses.
Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
TLDR
A novel method for eye contact detection is presented that combines a state-of-the-art appearance-based gaze estimator with a novel approach for unsupervised gaze target discovery, i.e. without the need for tedious and time-consuming manual data annotation.
The eye of the typer: a benchmark and analysis of gaze behavior during typing
TLDR
This work relates eye gaze with cursor activity, aligning both pointing and typing to eye gaze, and shows that incorporating typing behavior as a secondary signal improves eye tracking accuracy by 16% for touch typists, and 8% for non-touch typists.
Task-embedded online eye-tracker calibration for improving robustness to head motion
TLDR
An online calibration method is proposed to compensate for head movements when estimates of the gaze targets are available; it is reasonable to assume that, for correct selections, the user's gaze target during the dwell time was at the key center.
Reducing calibration drift in mobile eye trackers by exploiting mobile phone usage
TLDR
This work proposes two novel automatic recalibration methods that exploit mobile phone usage: one builds saliency maps from the phone location in the egocentric view to identify likely gaze locations, and the other uses the occurrence of touch events to recalibrate the eye tracker, thereby enabling privacy-preserving recalibration.

References

Showing 1–10 of 36 references
Building a Self-Learning Eye Gaze Model from User Interaction Data
TLDR
A set of robust geometric gaze features and a corresponding data validation mechanism are developed; the mechanism identifies good training data from noisy interaction-informed data collected in real-use scenarios and provides another level of cross-checking using previously accepted good data.
Calibration-Free Gaze Estimation Using Human Gaze Patterns
TLDR
This is the first work to auto-calibrate gaze estimators using human gaze patterns obtained from other viewers; the reported performance is lower than what could be achieved with dedicated hardware or a calibrated setup.
Cleaning up systematic error in eye-tracking data by using required fixation locations
  • A. Hornof, T. Halverson
  • Psychology
    Behavior Research Methods, Instruments, & Computers
  • 2002
TLDR
The disparity between recorded fixations and the required fixation locations can be used to monitor the deterioration in the accuracy of the eye tracker calibration, to automatically invoke a recalibration procedure when necessary, and to reduce the systematic error in the eye movement data collected for that participant.
An Incremental Learning Method for Unconstrained Gaze Estimation
TLDR
An online learning algorithm for appearance-based gaze estimation that allows free head movement in a casual desktop environment is presented, together with a pose-based clustering approach that efficiently extends an appearance manifold model to handle the large variations of the head pose.
Easy post-hoc spatial recalibration of eye tracking data
TLDR
A study shows that, at least in some cases, although the change in error across the display appears to be random, it in fact follows a consistent pattern that can be modeled using quadratic equations.
User see, user point: gaze and cursor alignment in web search
TLDR
A search study is conducted to determine when gaze and cursor are aligned, and thus when the cursor position is a good proxy for gaze position; the findings are used to improve the state-of-the-art technique for approximating visual attention with the cursor.
EyeTab: model-based gaze estimation on unmodified tablet computers
TLDR
EyeTab is presented, a model-based approach for binocular gaze estimation that runs entirely on an unmodified tablet; it builds on a set of established image processing and computer vision algorithms and adapts them for robust and near-real-time gaze estimation.
Mode-of-disparities error correction of eye-tracking data
TLDR
An error correction method that can reliably reduce systematic error and restore fixations to their true locations is proposed and it is shown that the method is reliable when the visual objects in the experiment are arranged in an irregular manner.
In the Eye of the Beholder: A Survey of Models for Eyes and Gaze
  • D. Hansen, Q. Ji
  • Computer Science
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2010
TLDR
This review shows that, despite their apparent simplicity, the development of a general eye detection technique involves addressing many challenges, requires further theoretical developments, and is consequently of interest to many other problem domains in computer vision and beyond.
Nonlinear Eye Gaze Mapping Function Estimation via Support Vector Regression
TLDR
Experiments with multiple users show that eye gaze can be estimated accurately under natural head movement via the proposed technique, and that the learned mapping function can be used by other users through a simple personal adaptation, without retraining.