Interactive gaze and finger controlled HUD for cars

@article{Prabhakar2019InteractiveGA,
  title={Interactive gaze and finger controlled HUD for cars},
  author={Gowdham Prabhakar and Aparna Nicole Ramakrishnan and Modiksha Madan and L. R. D. Murthy and Vinay Krishna Sharma and Sachin Deshmukh and Pradipta Biswas},
  journal={Journal on Multimodal User Interfaces},
  year={2019},
  volume={14},
  pages={101-121}
}
Modern infotainment systems in automobiles facilitate driving at the cost of adding secondary tasks to the primary task of driving. These secondary tasks have a considerable chance of distracting a driver from the primary driving task, thereby reducing safety or increasing cognitive workload. This paper presents an intelligent interactive head-up display (HUD) on the windscreen that does not require the driver to take their eyes off the road while undertaking secondary tasks like playing…
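
To make the interaction concrete, here is a minimal sketch of how gaze pointing could be combined with an explicit finger trigger to select HUD icons; the class names, coordinates, and trigger logic are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: gaze-driven pointing plus a finger "press" trigger for
# selecting icons on a windscreen HUD. All names and numbers are hypothetical.
from dataclasses import dataclass


@dataclass
class HudIcon:
    name: str
    x: float        # icon centre in HUD coordinates (pixels)
    y: float
    radius: float   # activation radius (pixels)

    def contains(self, px: float, py: float) -> bool:
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius ** 2


def select_icon(gaze_xy, finger_pressed, icons):
    """Return the icon under the gaze point, but only when the finger trigger fires.

    Gating selection on an explicit finger event avoids the 'Midas touch' problem
    of dwell-only gaze interfaces.
    """
    if not finger_pressed:
        return None
    gx, gy = gaze_xy
    for icon in icons:
        if icon.contains(gx, gy):
            return icon
    return None


# Example: the music icon is selected while the driver looks at it and presses.
icons = [HudIcon("music", 320, 120, 40), HudIcon("navigation", 480, 120, 40)]
print(select_icon((310, 130), finger_pressed=True, icons=icons))
```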

A wearable virtual touch system for IVIS in cars

This paper evaluated the efficacy of the proposed virtual touch system with an eye gaze switch inside a real car and investigated acceptance of the system by professional drivers using qualitative research.

Efficient Interaction with Automotive Heads-Up Displays using Appearance-based Gaze Tracking

A novel webcam-based gaze tracking system for interacting with icons on a HUD is proposed, and it is observed that, using the proposed eye gaze system, users were able to select icons on the HUD as fast as with the gesture modality.

Visual Enhancements for the Driver’s Information Search on Automotive Head-up Display

In the past, in-vehicle head-up displays (HUDs) were used to display simple information including driving speed and the distance between cars. However, recent HUDs now display complex…

A Brief Survey on Interactive Automotive UI

Augmented Reality for Future Mobility: Insights from a Literature Review and HCI Workshop

A workshop addressing AR in automotive human-computer interaction (HCI) design was organized, identifying a number of challenges, including human factors issues that need to be tackled, as well as opportunities and practical uses of AR in future mobility.

Eye-Gaze Interface to Operate Aircraft Displays

This paper presents three approaches to developing eye gaze trackers that use a webcam instead of infrared illumination and are designed to remain functional under high-illumination conditions, and presents an intelligent tracker, developed using the OpenFace framework, that provides results comparable to a COTS eye tracker in terms of interaction speed for both indoor and outdoor conditions.
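
As a rough illustration of how such a webcam tracker's output could drive a display, the sketch below fits a simple affine calibration from OpenFace-style gaze angles to screen coordinates; the CSV path, calibration targets, and the assumption that the log exposes gaze_angle_x / gaze_angle_y columns are mine, not taken from the paper.

```python
# Hedged sketch: map OpenFace gaze angles to screen coordinates with a least-squares
# affine fit. The file name and calibration targets are hypothetical.
import numpy as np
import pandas as pd


def fit_gaze_to_screen(angles: np.ndarray, screen_points: np.ndarray) -> np.ndarray:
    """Least-squares affine map from (gaze_angle_x, gaze_angle_y) to (screen_x, screen_y)."""
    design = np.hstack([angles, np.ones((len(angles), 1))])   # rows: [ax, ay, 1]
    coeffs, *_ = np.linalg.lstsq(design, screen_points, rcond=None)
    return coeffs                                             # shape (3, 2)


def gaze_to_screen(angles: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    design = np.hstack([angles, np.ones((len(angles), 1))])
    return design @ coeffs


# Calibration: the user fixates a few known screen points while OpenFace logs gaze.
log = pd.read_csv("openface_output.csv")                      # hypothetical log file
log.columns = log.columns.str.strip()                         # OpenFace pads column names
angles = log[["gaze_angle_x", "gaze_angle_y"]].to_numpy()[:4] # one sample per target
targets = np.array([[100, 100], [1820, 100], [100, 980], [1820, 980]], float)
coeffs = fit_gaze_to_screen(angles, targets)
cursor = gaze_to_screen(angles[:1], coeffs)                   # estimated on-screen point
```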

M[eye]cro

This work presents M[eye]cro, an interaction technique for selecting on-screen objects and navigating menus through the synergistic use of eye gaze and thumb-to-finger microgestures, and shows that M[eye]cro induces less fatigue and is mostly preferred.

A Systematic Review of Augmented Reality Applications for Automated Driving: 2009–2020

There is a growing trend toward simulating AR content within virtual driving simulators, and insights are provided into the utilization of AR technology at different levels of vehicle automation and for different users and tasks.

A Systematic Review of Virtual Reality Applications for Automated Driving: 2009–2020

Insight is provided into the utilization of VR technology applicable at specific levels of vehicle automation and for different users (drivers, passengers, pedestrians) and tasks, along with recommendations for future research on automated driving at the VR side of the reality-virtuality continuum.

References

Making use of drivers' glances onto the screen for explicit gaze-based interaction

Eye-gaze tracking in combination with a button on the steering wheel is presented as explicit input substituting for interaction on the touch screen, combining the advantages of direct interaction on visual displays without the drawbacks of touch screens.

Gaze-based interaction on multiple displays in an automotive environment

An unobtrusive and contactless sensor analyzes the driver's eye gaze, which enables the development of gaze driven interaction concepts for operating driver assistance and infotainment systems.

SideWays: a gaze interface for spontaneous interaction with situated displays

SideWays is presented, a novel person-independent eye gaze interface that supports spontaneous interaction with displays: users can just walk up to a display and immediately interact using their eyes, without any prior user calibration or training.

Multimodal Intelligent Eye-Gaze Tracking System

A new target prediction model is developed that works for different input modalities; combining eye-gaze-tracking-based pointing with a joystick controller can reduce pointing and selection times, and the amplitude of the maximum power component obtained through a Fourier transform of the pupil signal significantly correlates with selection times.
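
As a small, hedged sketch of the pupil-signal feature mentioned above, the snippet below computes the maximum power component of a pupil-diameter trace via a Fourier transform; the sampling rate and the synthetic trace are assumptions made for illustration.

```python
# Illustrative sketch: peak frequency and power of a detrended pupil-diameter signal.
import numpy as np


def max_power_component(pupil_diameter: np.ndarray, fs: float) -> tuple:
    """Return (peak_frequency_hz, peak_power) of the detrended pupil signal."""
    signal = pupil_diameter - np.mean(pupil_diameter)   # remove the DC component
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    k = int(np.argmax(power[1:]) + 1)                    # skip the zero-frequency bin
    return freqs[k], power[k]


# Example: a synthetic 60 Hz pupil trace with a slow 0.5 Hz oscillation plus noise.
fs = 60.0
t = np.arange(0, 10, 1 / fs)
pupil = 3.5 + 0.2 * np.sin(2 * np.pi * 0.5 * t) + 0.02 * np.random.randn(len(t))
print(max_power_component(pupil, fs))
```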

Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input

The results show that the relative gaze augmented selection technique outperforms the other techniques for simple tapping tasks alternating between horizontal and vertical surfaces, and for dragging on the vertical surface, but when tasks involve dragging across surfaces, the findings are more complex.

Estimating Pilots’ Cognitive Load From Ocular Parameters Through Simulation and In-Flight Studies

It is found that ocular parameters like rate of fixation are significantly different in different flying conditions and can be used for real-time estimation of pilots’ cognitive load, providing suitable warnings and alerts to the pilot in the cockpit and supporting training of military pilots in cognitive load management during operational missions.
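
For illustration only (not the paper's pipeline), here is a minimal sketch that estimates rate of fixation from a gaze trace with a basic dispersion-threshold (I-DT) detector; the thresholds, sampling rate, and synthetic data are assumptions.

```python
# Illustrative sketch: dispersion-threshold (I-DT) fixation detection and fixation rate.
import numpy as np


def _dispersion(window: np.ndarray) -> float:
    """Sum of the x-range and y-range of the gaze samples in the window."""
    return (window[:, 0].max() - window[:, 0].min()) + (window[:, 1].max() - window[:, 1].min())


def fixation_rate(gaze_xy: np.ndarray, fs: float,
                  max_dispersion: float = 1.0, min_duration_s: float = 0.1) -> float:
    """Fixations per second; a fixation is a window whose dispersion stays below
    max_dispersion (e.g. degrees of visual angle) for at least min_duration_s."""
    min_len = int(min_duration_s * fs)
    n, fixations, i = len(gaze_xy), 0, 0
    while i + min_len <= n:
        j = i + min_len
        if _dispersion(gaze_xy[i:j]) <= max_dispersion:
            # Grow the window while the dispersion stays under the threshold.
            while j < n and _dispersion(gaze_xy[i:j + 1]) <= max_dispersion:
                j += 1
            fixations += 1
            i = j
        else:
            i += 1
    return fixations / (n / fs)


# Example: five 1-second synthetic fixations sampled at 60 Hz (degrees of visual angle).
rng = np.random.default_rng(0)
centres = rng.uniform(0, 30, size=(5, 2))
trace = np.repeat(centres, 60, axis=0) + 0.05 * rng.standard_normal((300, 2))
print(round(fixation_rate(trace, fs=60.0), 2))   # roughly 1 fixation per second
```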

A Multimodal Air Gesture Interface for In Vehicle Menu Navigation

Multimodal and visual-only air gesture systems for navigating menus in the vehicle were developed and compared to a conventional direct touch system in a driving simulator using various distraction…

Gaze and Touch Interaction on Tablets

This work proposes gaze and touch input, where touches redirect to the gaze target, and presents a user study comparing this technique to direct touch, showing that users are slightly slower but can operate one-handed with less physical effort.

hMouse: Head Tracking Driven Virtual Computer Mouse

Yun Fu, Thomas S. Huang · 2007 IEEE Workshop on Applications of Computer Vision (WACV '07)
Experimental results demonstrate that hMouse succeeds under the circumstances of user jumping, extreme movement, large-degree rotation, turning around, hand/object occlusion, part of the face falling outside the camera's shooting region, and multiuser occlusion.