Design recommendations for camera-based head-controlled interfaces that replace the mouse for motion-impaired users
This paper presents a real-time, non-invasive vision system that provides hands-free cursor control: users point their nose at the location on the monitor screen where they wish to place the cursor. The system robustly tracks 3D face position and orientation in real time using a framework called Incremental Focus of Attention (IFA). IFA integrates trackers based on multiple cues (including color, intensity templates, and dark point features) that cooperate to maintain tracking under adverse visual conditions. The pose recovered from tracking is then used to compute the intersection between the plane of the monitor screen and an imaginary ray extending forward from the user's nose. Results show that naive users can position a cursor within a 1 cm × 1 cm square from a distance of 50 cm from the monitor.
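The ray-plane intersection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the tracker supplies the nose position and a forward direction vector in the same coordinate frame as the screen, and all names and coordinate conventions here are illustrative.

```python
def nose_ray_screen_intersection(nose_pos, nose_dir, screen_point, screen_normal):
    """Intersect the ray extending forward from the nose with the monitor plane.

    nose_pos:      3D nose position (e.g. in camera/world coordinates)
    nose_dir:      vector pointing forward from the nose
    screen_point:  any point lying on the monitor plane
    screen_normal: normal vector of the monitor plane
    Returns the 3D intersection point, or None if the ray is parallel
    to the screen or the screen lies behind the user.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(nose_dir, screen_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen plane
    diff = tuple(s - n for s, n in zip(screen_point, nose_pos))
    t = dot(diff, screen_normal) / denom
    if t < 0:
        return None  # intersection is behind the user
    return tuple(n + t * d for n, d in zip(nose_pos, nose_dir))

# Example: user 50 cm in front of a screen in the z = 0 plane,
# with the nose turned slightly to the side.
p = nose_ray_screen_intersection(
    nose_pos=(0.0, 0.0, 0.5),       # 50 cm from the screen (metres)
    nose_dir=(0.02, 0.0, -1.0),     # pointing at the screen, slightly off-axis
    screen_point=(0.0, 0.0, 0.0),   # origin lies on the screen plane
    screen_normal=(0.0, 0.0, 1.0),  # screen faces the user
)
# p is the on-screen cursor position, here offset 1 cm horizontally
```

In practice the 2D cursor position would then be obtained by projecting the 3D intersection point into the screen's pixel coordinate system, which requires a calibrated screen-to-camera transform.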