A primal-dual framework for real-time dense RGB-D scene flow


This paper presents the first method to compute dense scene flow in real-time for RGB-D cameras. It is based on a variational formulation where brightness constancy and geometric consistency are imposed. Accounting for the depth data provided by RGB-D cameras, regularization of the flow field is imposed on the 3D surface (or set of surfaces) of the observed scene instead of on the image plane, leading to more geometrically consistent results. The minimization problem is efficiently solved by a primal-dual algorithm implemented on a GPU, achieving previously unseen runtime performance. Several tests have been conducted to compare our approach with a state-of-the-art method (RGB-D flow), with both quantitative and qualitative results evaluated. Moreover, an additional set of experiments has been carried out to show the applicability of our work to real-time motion estimation. Results demonstrate the accuracy of our approach, which outperforms RGB-D flow and is able to estimate heterogeneous and non-rigid motions at a high frame rate.
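The paper does not include its implementation here, but the general primal-dual scheme it relies on (in the style of Chambolle-Pock) can be illustrated on a simpler, closely related variational problem: total-variation denoising, where a data term is combined with a regularizer and solved by alternating dual ascent and primal proximal steps. The sketch below is a minimal illustration of that iteration pattern, not the authors' scene-flow formulation; the function names and step sizes are our own choices.

```python
import numpy as np

def grad(u):
    # Forward differences with Neumann boundary conditions.
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    # Discrete divergence, the negative adjoint of grad.
    d = np.zeros_like(px)
    d[:, 0] = px[:, 0]
    d[:, 1:] = px[:, 1:] - px[:, :-1]
    d[0, :] += py[0, :]
    d[1:, :] += py[1:, :] - py[:-1, :]
    return d

def primal_dual_tv(f, lam=0.1, n_iter=200):
    """Primal-dual (Chambolle-Pock) iterations for TV denoising:
       min_u 0.5*||u - f||^2 + lam*TV(u).
    Illustrative only: the paper solves a scene-flow energy, not this one."""
    u = f.copy()
    u_bar = u.copy()
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    tau = sigma = 1.0 / np.sqrt(8.0)  # step sizes with tau*sigma*L^2 <= 1

    for _ in range(n_iter):
        # Dual ascent on the regularizer, then project onto ||p|| <= lam.
        gx, gy = grad(u_bar)
        px += sigma * gx
        py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2) / lam)
        px /= norm
        py /= norm
        # Primal descent: proximal step of the quadratic data term.
        u_old = u
        u = (u + tau * div(px, py) + tau * f) / (1.0 + tau)
        # Over-relaxation step.
        u_bar = 2.0 * u - u_old
    return u
```

Each iteration is a handful of element-wise operations and finite differences, which is why this class of solver maps so naturally onto a GPU, as the paper exploits for real-time performance.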

DOI: 10.1109/ICRA.2015.7138986


Cite this paper

@article{Jaimez2015APF,
  title={A primal-dual framework for real-time dense RGB-D scene flow},
  author={Mariano Jaimez and Mohamed Souiai and Javier Gonz{\'a}lez and Daniel Cremers},
  journal={2015 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2015},
  pages={98-104}
}