Model-based motion blur estimation for the improvement of motion tracking
In this paper, we present a novel approach for assessing and interacting with surface tracking algorithms, targeting video manipulation in postproduction. As tracking inaccuracies are unavoidable, we enable the user to provide small hints to the algorithms instead of correcting erroneous results afterwards. Building on 2D mesh-warp-based optical flow estimation, we visualize results and provide tools for user feedback in a consistent reference system: texture space. In this space, accurate tracking results are reflected by a static appearance, and errors can easily be spotted as apparent change. A variety of established tools can be utilized to visualize and assess the change between frames. User interaction to improve tracking results becomes more intuitive in texture space, as it can focus on a small region rather than a moving object. We show how established tools can be adapted for interaction in texture space, providing a more intuitive interface that allows more effective and accurate user feedback.
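The core idea of the texture-space visualization can be illustrated with a minimal sketch (not the paper's implementation): each frame is warped into a fixed texture space via the tracked mesh, here reduced to a single triangle with a per-triangle affine map and nearest-neighbour sampling. The function names (`affine_from_triangles`, `warp_to_texture`) and the synthetic stripe pattern are illustrative assumptions. When the track is accurate, the warped texture is identical across frames, so the difference image is zero.

```python
import numpy as np

def affine_from_triangles(src, dst):
    # Solve for the 2x3 affine A with A @ [x, y, 1]^T = dst for each vertex.
    M = np.hstack([src, np.ones((3, 1))])          # 3x3 homogeneous src vertices
    return np.linalg.solve(M, dst).T               # 2x3

def warp_to_texture(frame, tracked_tri, tex_tri, tex_size):
    # Map every texture-space pixel back into the frame (nearest neighbour).
    A = affine_from_triangles(tex_tri, tracked_tri)  # texture -> frame coords
    h, w = tex_size
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])  # 3 x N
    fx, fy = (A @ coords).round().astype(int)
    fx = np.clip(fx, 0, frame.shape[1] - 1)
    fy = np.clip(fy, 0, frame.shape[0] - 1)
    return frame[fy, fx].reshape(h, w)

def make_frame(shift, size=64):
    # Synthetic frame: a stripe pattern that moves together with the surface.
    _, xx = np.mgrid[0:size, 0:size]
    return ((xx - shift) % 8 < 4).astype(float)

# Fixed texture-space triangle; the tracked triangle translates with the surface.
tex_tri = np.array([[0, 0], [31, 0], [0, 31]], float)
tex0 = warp_to_texture(make_frame(0), tex_tri + [10, 10], tex_tri, (32, 32))
tex1 = warp_to_texture(make_frame(5), tex_tri + [15, 10], tex_tri, (32, 32))
print(np.abs(tex0 - tex1).max())  # 0.0 — an accurate track yields a static texture
```

A tracking error (e.g. an offset tracked vertex) would instead produce a nonzero difference image localized at the misaligned region, which is what makes errors easy to spot and correct in texture space.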