Multimodal Intelligent Eye-Gaze Tracking System

@article{Biswas2015MultimodalIE,
  title={Multimodal Intelligent Eye-Gaze Tracking System},
  author={Pradipta Biswas and Patrick Langdon},
  journal={International Journal of Human-Computer Interaction},
  year={2015},
  volume={31},
  pages={277--294}
}
  • P. Biswas, P. Langdon
  • Published 3 February 2015
  • Computer Science
  • International Journal of Human-Computer Interaction
This article presents a series of user studies to develop a new eye-gaze tracking–based pointing system. We developed a new target prediction model that works for different input modalities and combined eye-gaze tracking–based pointing with a joystick controller, which can reduce pointing and selection times. The system finds important applications in the cockpit of combat aircraft and for novice computer users. User studies confirmed that users can perform significantly faster using this new eye… 
Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm
TLDR
A complete prototype of the mobile gaze-tracking system ‘Etracker’ with a near-eye viewing device for human gaze tracking is constructed, and a combined gaze-tracking algorithm is proposed that uses the mean value of gaze points to resolve pupil center changes caused by nystagmus in calibration algorithms.
How Does Eye-Gaze Relate to Gesture Movement in an Automotive Pointing Task?
TLDR
The response time and the misalignment between motor and visual attention during pointing-and-selection tasks are examined and shown to be strongly affected by driving conditions.
Operating Different Displays in Military Fast Jets Using Eye Gaze Tracker
TLDR
The studies found that the gaze-controlled interface statistically significantly increased the speed of interaction for secondary mission-control tasks compared to touchscreen- and joystick-based target designation systems.
Comparing Ocular Parameters for Cognitive Load Measurement in Eye-Gaze-Controlled Interfaces for Automotive and Desktop Computing Environments
TLDR
It was found that average velocity of a particular type of microsaccadic eye movement called Saccadic Intrusion is most indicative of users’ cognitive load compared to pupil dilation and eye-blink-based parameters.
Adaptive calibration method based on state space model for eye gaze HCI system
  • Jiao Xu, Qijie Zhao
  • Computer Science
    2017 4th International Conference on Systems and Informatics (ICSAI)
  • 2017
TLDR
Results show that the proposed adaptive gaze-tracking human-computer interaction method based on state-space transformation adapts well, reducing the need for frequent system recalibration and avoiding interruptions during the HCI process.
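The entry above names a state-space formulation but not its details. Purely as a rough illustration (the noise parameters and helper names are assumptions, not the method of Xu and Zhao), the Python sketch below treats the gaze-calibration offset as the hidden state of a simple per-axis Kalman-style filter, updated whenever the user selects a point whose true position is known, so drift can be absorbed without rerunning a full calibration.

import numpy as np

class AdaptiveOffsetFilter:
    """Illustrative sketch: track a slowly drifting (dx, dy) calibration
    offset with a per-axis Kalman-style update."""

    def __init__(self, process_var=1.0, measurement_var=25.0):
        self.offset = np.zeros(2)        # estimated calibration offset in pixels
        self.var = np.full(2, 100.0)     # variance of the offset estimate, per axis
        self.process_var = process_var   # how quickly the offset may drift (assumed)
        self.measurement_var = measurement_var  # noise of one observation (assumed)

    def update(self, raw_gaze, selected_point):
        """Use the error between raw gaze and a point the user actually selected."""
        observed = np.asarray(selected_point, float) - np.asarray(raw_gaze, float)
        self.var = self.var + self.process_var             # predict: offset may have drifted
        gain = self.var / (self.var + self.measurement_var)
        self.offset = self.offset + gain * (observed - self.offset)
        self.var = (1.0 - gain) * self.var

    def correct(self, raw_gaze):
        """Apply the current offset estimate to a raw gaze sample."""
        return np.asarray(raw_gaze, float) + self.offset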
Multimodal Gaze Controlled Dashboard
TLDR
This paper explores the use of eye-gaze tracking as a direct controller of electronic displays inside a car and analyses drivers' cognitive load, proposing and validating new multimodal fusion algorithms involving eye-gaze and finger-tracking systems.
Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments
TLDR
Four user studies involving driving and flight simulators have found that the proposed projected display can improve driving and flying performance and significantly reduce pointing and selection times for secondary mission control tasks compared to existing interaction systems.
Analysing Ocular Parameters of Users with Cerebral Palsy for Developing Gaze Controlled Interface
Purpose: To investigate the response to visual stimuli and the visual search patterns of users with cerebral palsy, and to use that information to facilitate the development of an eye-gaze-controlled interaction system.
...
...

References

SHOWING 1-10 OF 59 REFERENCES
A new input system for disabled users involving eye gaze tracker and scanning interface
TLDR
A new input interaction system for people with severe disabilities that combines eye-gaze tracking and single-switch scanning interaction techniques in a unique way that is faster than scanning-only systems while more comfortable to use than systems based only on eye-gaze tracking.
Remote Eye Gaze Tracking System as a Computer Interface for Persons with Severe Motor Disability
TLDR
The novelty of this study resides in the integration of several procedures, such as real-time improvement of the eye-to-mouse-pointer coordinate conversion mechanism, a practical solution to mouse-click operations, and effective means to monitor and evaluate system performance.
Understanding users and their needs
TLDR
Reports findings from gaze-control user trials involving users from two groups: people who are totally paralyzed and people with a wide range of complex disabilities.
Multimodal target prediction model
TLDR
A neural-network-based model is presented that can predict the pointing target for users with both physical and situational impairments, and it accurately predicts the target in all cases.
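The summary above does not spell out the model's inputs or architecture. Purely as an illustration of trajectory-based target prediction (the feature set, network size, and random weights below are assumptions, not the published model), the Python sketch scores candidate on-screen targets with a tiny feed-forward network over hand-picked features: bearing to the target, remaining distance, and current speed.

import numpy as np

def trajectory_features(points, target):
    """Features relating a partial pointer/gaze trajectory to one candidate target."""
    points = np.asarray(points, dtype=float)
    velocity = points[-1] - points[-2]                  # latest movement vector
    to_target = np.asarray(target, dtype=float) - points[-1]
    speed = np.linalg.norm(velocity)
    distance = np.linalg.norm(to_target)
    # cosine of the angle between the current heading and the candidate target
    heading_match = np.dot(velocity, to_target) / (speed * distance + 1e-9)
    return np.array([heading_match, distance, speed])

def score(features, w1, b1, w2, b2):
    """One hidden tanh layer producing a scalar score for one candidate."""
    hidden = np.tanh(w1 @ features + b1)
    return (w2 @ hidden + b2).item()

# Toy usage with untrained random weights: pick the highest-scoring candidate.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(8, 3)), np.zeros(8)
w2, b2 = rng.normal(size=(1, 8)), np.zeros(1)
trajectory = [(100, 100), (140, 120), (185, 142)]       # partial pointer path
candidates = [(400, 260), (150, 500), (600, 90)]        # on-screen targets
scores = [score(trajectory_features(trajectory, t), w1, b1, w2, b2) for t in candidates]
print("predicted target:", candidates[int(np.argmax(scores))])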
Application of Fitts' law to eye gaze interaction
TLDR
The Fitts' law model was shown to predict movement times equally well for both interaction techniques and is seen as a potential contributor to the design of modern multimodal human-computer interfaces.
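For readers unfamiliar with the formula being applied here, the snippet below shows the standard Shannon formulation of Fitts' law, MT = a + b * log2(D/W + 1); the coefficients a and b are placeholder values, not the ones fitted in the cited study.

import math

def index_of_difficulty(distance, width):
    """Index of difficulty (bits) for a target of the given width at the given distance."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds: MT = a + b * ID (placeholder coefficients)."""
    return a + b * index_of_difficulty(distance, width)

# Example: a 40-pixel target 400 pixels away has ID = log2(11) ≈ 3.46 bits.
print(round(movement_time(400, 40), 3))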
Manual and gaze input cascaded (MAGIC) pointing
TLDR
This work explores a new direction in utilizing eye gaze for computer input by proposing an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing, which might offer many advantages, including reduced physical effort and fatigue compared to traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing.
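As a rough sketch only, the Python below illustrates one variant of the MAGIC idea described above: when manual input resumes, the cursor is warped to the vicinity of the current gaze point, and the manual device then performs the fine positioning and selection. The callbacks get_gaze_point and warp_cursor are hypothetical placeholders rather than a real eye-tracker SDK, and the warp threshold is an assumption.

WARP_THRESHOLD_PX = 120   # only warp when the cursor is far from the gaze point (assumption)

class MagicPointer:
    def __init__(self, get_gaze_point, warp_cursor):
        self.get_gaze_point = get_gaze_point   # callable returning the current (x, y) gaze estimate
        self.warp_cursor = warp_cursor         # callable that moves the system cursor to (x, y)
        self.cursor = (0.0, 0.0)
        self.manual_active = False

    def on_manual_move(self, dx, dy):
        """Handle one manual (mouse/joystick) movement delta."""
        if not self.manual_active:
            # First manual movement after an idle period: warp near the gaze point.
            gx, gy = self.get_gaze_point()
            cx, cy = self.cursor
            if ((gx - cx) ** 2 + (gy - cy) ** 2) ** 0.5 > WARP_THRESHOLD_PX:
                self.cursor = (gx, gy)
                self.warp_cursor(gx, gy)
            self.manual_active = True
        # Subsequent deltas: ordinary manual fine positioning toward the target.
        x, y = self.cursor
        self.cursor = (x + dx, y + dy)
        self.warp_cursor(*self.cursor)

    def on_manual_idle(self):
        """Re-arm gaze warping after manual input has been idle long enough."""
        self.manual_active = False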
SideWays: a gaze interface for spontaneous interaction with situated displays
TLDR
SideWays is presented, a novel person-independent eye gaze interface that supports spontaneous interaction with displays: users can just walk up to a display and immediately interact using their eyes, without any prior user calibration or training.
An evaluation of an eye tracker as a device for computer input
TLDR
The results show that an eye tracker can be used as a fast selection device provided that the target size is not too small; when targets are small, speed declines and errors increase rapidly.
Combined head and eye tracking system for dynamic testing of the vestibular system
TLDR
A combined head-eye tracking system suitable for use during natural activities that provides integrated head and eye position measurement while allowing a large range of free head movement.
Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces
TLDR
This chapter describes research at NRL on developing interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way, and considers eye movement-based interaction as an exemplar of a new, more general class of non-command-based user-computer interaction.
...
...