Multimodal Intelligent Eye-Gaze Tracking System

@article{biswas2015multimodal,
  title={Multimodal Intelligent Eye-Gaze Tracking System},
  author={Pradipta Biswas and Patrick Langdon},
  journal={International Journal of Human-Computer Interaction},
  pages={277--294},
  year={2015}
}
  • Published 3 February 2015
  • Computer Science
This article presents a series of user studies to develop a new eye-gaze-tracking-based pointing system. We developed a new target prediction model that works across different input modalities and combined eye-gaze-tracking-based pointing with a joystick controller to reduce pointing and selection times. The system finds important applications in the cockpits of combat aircraft and for novice computer users. User studies confirmed that users can perform significantly faster using this new eye…
Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm
A complete prototype of the mobile gaze-tracking system 'Etracker' with a near-eye viewing device for human gaze tracking is constructed, and a combined gaze-tracking algorithm is proposed that uses the mean value of gazes to resolve pupil-center changes caused by nystagmus in calibration.
How Does Eye-Gaze Relate to Gesture Movement in an Automotive Pointing Task?
The response time and the misalignment between motor and visual attention during pointing-selection tasks are considered and are shown to be highly affected by driving conditions.
Operating Different Displays in Military Fast Jets Using Eye Gaze Tracker
The studies found that the gaze-controlled interface statistically significantly increased the speed of interaction for secondary mission control tasks compared with touchscreen- and joystick-based target designation systems.
Comparing Ocular Parameters for Cognitive Load Measurement in Eye-Gaze-Controlled Interfaces for Automotive and Desktop Computing Environments
It was found that the average velocity of a particular type of microsaccadic eye movement, called saccadic intrusion, is most indicative of users' cognitive load compared with pupil-dilation and eye-blink-based parameters.
Adaptive calibration method based on state space model for eye gaze HCI system
  • Jiao Xu, Qijie Zhao
  • Computer Science
  • 2017 4th International Conference on Systems and Informatics (ICSAI)
Results show that the proposed adaptive gaze-tracking human-computer interaction method, based on state-space transformation, adapts well, mitigating the problem of frequent system recalibration without interrupting the HCI process.
Multimodal Gaze Controlled Dashboard
This paper explores the use of eye-gaze tracking as a direct controller of electronic displays inside a car and analyses drivers' cognitive load, proposing and validating new multimodal fusion algorithms involving eye-gaze and finger-tracking systems.
Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments
Four user studies involving driving and flight simulators have found that the proposed projected display can improve driving and flying performance and significantly reduce pointing and selection times for secondary mission control tasks compared to existing interaction systems.
Analysing Ocular Parameters of Users with Cerebral Palsy for Developing Gaze Controlled Interface
Purpose: To investigate the response to visual stimuli and the visual search patterns of users with cerebral palsy, and to use that information to facilitate the development of an eye-gaze-controlled interaction system.


A new input system for disabled users involving eye gaze tracker and scanning interface
A new input interaction system for people with severe disabilities combines eye-gaze tracking and single-switch scanning interaction techniques in a unique way that is faster than scanning-only systems while more comfortable to use than systems based solely on eye-gaze tracking.
Remote Eye Gaze Tracking System as a Computer Interface for Persons with Severe Motor Disability
The novelty of this study resides in the integration of several procedures, such as: real time improvement of the eye-to-mouse-pointer coordinate conversion mechanism, determination of a practical solution to the mouse click operations, and development of effective means to monitor and evaluate the system performance.
Understanding users and their needs
Findings are reported from gaze-control user trials involving users from both groups: people who are totally paralyzed and people with a wide range of complex disabilities.
Multimodal target prediction model
A neural-network-based model is presented that predicts the pointing target for users with both physical and situational impairments and accurately predicts the target in all tested cases.
Application of Fitts' law to eye gaze interaction
The Fitts' law model was shown to predict movement times equally well for both interaction techniques and is seen as a potential contributor to the design of modern multimodal human-computer interfaces.
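For context, the Fitts' law model referenced here is typically used in its Shannon formulation, MT = a + b · log2(D/W + 1). A minimal sketch of how predicted movement time is computed follows; the coefficients a and b are illustrative placeholders, not values fitted in the cited study:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Predict movement time (seconds) via the Shannon form of
    Fitts' law: MT = a + b * log2(D/W + 1).

    a and b are fitted empirically per device and interaction
    technique; the defaults here are illustrative only.
    """
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A farther or smaller target has a higher index of difficulty,
# and therefore a longer predicted movement time.
mt_near_large = fitts_movement_time(distance=100, width=50)
mt_far_small = fitts_movement_time(distance=800, width=20)
assert mt_far_small > mt_near_large
```

Comparing gaze pointing to manual pointing then reduces to fitting separate (a, b) pairs per technique and checking how well the model's predictions track observed movement times.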
Manual and gaze input cascaded (MAGIC) pointing
This work explores a new direction in utilizing eye gaze for computer input by proposing an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing, which might offer many advantages, including reduced physical effort and fatigue compared with traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing.
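The core idea of MAGIC pointing is that gaze coarsely positions the cursor near the target, while the manual device performs only the final fine adjustment. A simplified sketch of the warping step follows; the `Point` type and the 120-pixel threshold are illustrative assumptions, not details from the paper:

```python
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def magic_warp(cursor: Point, gaze: Point, warp_threshold: float = 120.0) -> Point:
    """Sketch of MAGIC-style cursor warping: when manual input
    begins, jump the cursor to the current gaze point if it is
    far away, leaving fine positioning to the manual device.

    The 120-pixel threshold is an illustrative assumption.
    """
    dist = math.hypot(gaze.x - cursor.x, gaze.y - cursor.y)
    if dist > warp_threshold:
        return Point(gaze.x, gaze.y)  # warp near the gazed target
    return cursor  # already close: manual fine-tuning only
```

Gating the warp on a distance threshold reflects the design tension the paper discusses: warping on every gaze shift is distracting, while never warping forfeits the speed advantage of gaze.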
SideWays: a gaze interface for spontaneous interaction with situated displays
SideWays is presented, a novel person-independent eye gaze interface that supports spontaneous interaction with displays: users can just walk up to a display and immediately interact using their eyes, without any prior user calibration or training.
An evaluation of an eye tracker as a device for computer input
The results show that an eye tracker can be used as a fast selection device provided that the target size is not too small; if the targets are small, speed declines and errors increase rapidly.
Combined head and eye tracking system for dynamic testing of the vestibular system
A combined head-eye tracking system suitable for use with free head movement during natural activities, providing integrated head and eye position measurement while allowing a large range of head movement.
Eye Movement-Based Human-Computer Interaction Techniques: Toward Non-Command Interfaces
This chapter describes research at NRL on developing interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way, and considers eye-movement-based interaction as an exemplar of a new, more general class of non-command-based user-computer interaction.