Intent Inference for Hand Pointing Gesture-Based Interactions in Vehicles

Abstract

Using interactive displays, such as a touchscreen, in vehicles typically requires dedicating a considerable amount of visual as well as cognitive capacity and undertaking a hand pointing gesture to select the intended item on the interface. This can distract from the primary task of driving and consequently can have serious safety implications. Due to road and driving conditions, the user input can also be highly perturbed, resulting in erroneous selections that compromise the system's usability. In this paper, we propose intent-aware displays that utilize a pointing gesture tracker in conjunction with suitable Bayesian destination inference algorithms to determine the item the user intends to select, which can be achieved with high confidence remarkably early in the pointing gesture. This can drastically reduce the time and effort required to successfully complete an in-vehicle selection task. In the proposed probabilistic inference framework, the likelihood of all the nominal destinations is sequentially calculated by modeling the hand pointing gesture movements as a destination-reverting process. This leads to a Kalman filter-type implementation of the prediction routine that requires minimal parameter training and has low computational burden; it is also amenable to parallelization. The substantial gains obtained using an intent-aware display are demonstrated using data collected in an instrumented vehicle driven under various road conditions.
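To illustrate the flavor of the approach, the following is a minimal sketch (not the authors' implementation) of destination inference under a destination-reverting linear-Gaussian model: for each nominal on-screen destination, a Kalman filter predicts the next pointing-finger position as reverting toward that destination, and the accumulated innovation log-likelihood scores how well the observed trajectory supports it. The transition model `x_{k+1} = x_k + dt*lam*(d - x_k) + w_k`, the reversion rate, noise levels, and all function names here are illustrative assumptions, and the per-destination filters are independent, reflecting the parallelizability noted in the abstract.

```python
import numpy as np

def destination_log_likelihoods(observations, destinations, dt=0.05,
                                revert_rate=4.0, q=1e-4, r=1e-4):
    """Score each candidate destination d by running a Kalman filter under
    the assumed destination-reverting model
        x_{k+1} = x_k + dt*revert_rate*(d - x_k) + w_k,   w_k ~ N(0, q*I),
    with noisy position observations z_k = x_k + v_k, v_k ~ N(0, r*I),
    and accumulating the Gaussian innovation log-likelihood."""
    obs = np.asarray(observations, dtype=float)
    n_obs, dim = obs.shape
    a = 1.0 - dt * revert_rate          # per-axis transition coefficient
    Q = q * np.eye(dim)                 # process noise covariance
    R = r * np.eye(dim)                 # measurement noise covariance
    logliks = []
    for d in destinations:
        d = np.asarray(d, dtype=float)
        m = obs[0].copy()               # initialize state at first observation
        P = R.copy()
        ll = 0.0
        for z in obs[1:]:
            # Predict: state reverts toward candidate destination d
            m_pred = a * m + (1.0 - a) * d
            P_pred = (a * a) * P + Q
            # Innovation and its covariance
            v = z - m_pred
            S = P_pred + R
            ll += -0.5 * (dim * np.log(2.0 * np.pi)
                          + np.log(np.linalg.det(S))
                          + v @ np.linalg.solve(S, v))
            # Standard Kalman update
            K = P_pred @ np.linalg.inv(S)
            m = m_pred + K @ v
            P = (np.eye(dim) - K) @ P_pred
        logliks.append(ll)
    return np.array(logliks)

# Demo with synthetic data: simulate a pointing trajectory toward one of
# three hypothetical screen items and check which destination wins.
rng = np.random.default_rng(0)
dests = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([1.0, 1.0])]
dt, lam, q, r = 0.05, 4.0, 1e-4, 1e-4
x = np.array([0.5, -0.5])               # starting finger position
true_dest = dests[2]
traj = [x + np.sqrt(r) * rng.standard_normal(2)]
for _ in range(30):
    x = x + dt * lam * (true_dest - x) + np.sqrt(q) * rng.standard_normal(2)
    traj.append(x + np.sqrt(r) * rng.standard_normal(2))

lls = destination_log_likelihoods(traj, dests, dt=dt, revert_rate=lam, q=q, r=r)
best = int(np.argmax(lls))              # index of the inferred destination
```

With a uniform prior over destinations, normalizing `exp(lls)` gives the posterior over nominal destinations at the current time step; in an intent-aware display this posterior would be refreshed as each new tracked position arrives, allowing early selection once one destination dominates.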

DOI: 10.1109/TCYB.2015.2417053

Cite this paper

@article{Ahmad2016IntentIF, title={Intent Inference for Hand Pointing Gesture-Based Interactions in Vehicles}, author={Bashar I. Ahmad and James K. Murphy and Patrick Langdon and Simon J. Godsill and Robert Hardy and Lee Skrypchuk}, journal={IEEE Transactions on Cybernetics}, year={2016}, volume={46}, number={4}, pages={878--889} }