Time of flight cameras produce real-time range maps at a relatively low cost using continuous wave amplitude modulation and demodulation. However, they are geared to measure range (or phase) for a single reflected bounce of light and suffer from systematic errors due to multipath interference. We re-purpose the conventional time of flight device for a new …
Current Time-of-Flight approaches mainly employ continuous wave intensity modulation. Phase reconstruction is performed using multiple phase images with different phase shifts, which is equivalent to sampling the inherent correlation function at different locations. This active imaging approach delivers a very specific set of influences, …
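A minimal sketch of this sampling idea, assuming four equally spaced phase steps (the usual four-bucket scheme, not necessarily the configuration used in the paper):

    import numpy as np

    def phase_amplitude(I0, I1, I2, I3):
        """Recover phase, amplitude and offset from four samples of the
        correlation function taken at phase offsets of 0, 90, 180, 270 deg."""
        phase = np.arctan2(I3 - I1, I0 - I2)         # wrapped phase in [-pi, pi]
        amplitude = 0.5 * np.hypot(I3 - I1, I0 - I2)
        offset = 0.25 * (I0 + I1 + I2 + I3)          # background / DC level
        return phase, amplitude, offset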
The emergence of commercial time of flight (ToF) cameras for real-time depth images has motivated extensive study of how ToF information can be exploited. In principle, a ToF camera is an active sensor that emits an amplitude modulated near-infrared (NIR) signal, which illuminates a given scene. The per-pixel phase difference of the modulation between reflected …
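A hedged sketch of the usual phase-to-range conversion; the 30 MHz modulation frequency is an assumed example value, not a figure from the paper:

    import numpy as np

    C = 299_792_458.0                  # speed of light, m/s

    def phase_to_range(phase, f_mod=30e6):
        """Convert wrapped modulation phase (radians) to range in metres;
        the result aliases beyond the unambiguous range c / (2 * f_mod)."""
        phase = np.mod(phase, 2 * np.pi)
        return C * phase / (4 * np.pi * f_mod)

At 30 MHz the unambiguous range is about 5 m.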
Full-field range imaging cameras, which are an example of amplitude modulated continuous wave (AMCW) lidar, are subject to multiple error sources. In this paper we consider the possibility that a global jitter and/or drift occurs between the modulation signals to the active illumination source and the camera sensor. A model for jitter is presented and is …
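A rough illustration of why such jitter matters, under the simple assumption that a timing offset between the two modulation signals appears directly as a phase offset:

    C = 299_792_458.0  # speed of light, m/s

    def range_bias_from_timing_offset(dt_seconds):
        """A timing offset dt between the illumination and sensor modulation
        gives a phase offset of 2*pi*f_mod*dt, i.e. a range bias of c*dt/2,
        independent of the actual range and of f_mod."""
        return C * dt_seconds / 2.0

    print(range_bias_from_timing_offset(100e-12))   # ~0.015 m for 100 ps of drift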
Optical flow is the vector inverse problem of estimating motion through an image sequence. Geometric Algebra is an appropriate mathematical language for describing and solving vector problems. In this paper we apply Geometric Algebra to optical flow and pose a direct solution using a simple smoothness constraint. Implementational considerations are given and …
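The Geometric Algebra formulation itself is not reproduced here; as a stand-in, the sketch below solves the same smoothness-constrained optical flow problem with the classical Horn-Schunck iteration in ordinary vector notation:

    import numpy as np

    def horn_schunck(im1, im2, alpha=1.0, n_iter=100):
        """Dense optical flow between two greyscale frames via a quadratic
        smoothness constraint (classical Horn-Schunck scheme)."""
        im1, im2 = im1.astype(float), im2.astype(float)
        Iy, Ix = np.gradient(0.5 * (im1 + im2))      # spatial gradients
        It = im2 - im1                               # temporal derivative
        u = np.zeros_like(im1)
        v = np.zeros_like(im1)
        for _ in range(n_iter):
            # 4-neighbour mean of the current flow estimate
            u_bar = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                            np.roll(u, 1, 1) + np.roll(u, -1, 1))
            v_bar = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                            np.roll(v, 1, 1) + np.roll(v, -1, 1))
            common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
            u = u_bar - Ix * common
            v = v_bar - Iy * common
        return u, v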
The design of an audible sonar distributed sensor time-of-flight range imaging system is investigated, sonar being chosen as a substitute for optical range imaging due to cost and simplicity of implementation. The distributed range imaging system proposed is based on the holographic principle, where the sensors detect the self-interference of the reflected …
Time-of-Flight (ToF) range cameras measure the depth from the camera to the objects in the field of view. This is achieved by illuminating the scene with amplitude modulated light and measuring the phase shift in the modulation envelope between the outgoing and reflected light. ToF cameras suffer from measurement errors when multiple propagation paths …
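A hedged sketch of the error mechanism (not of the paper's correction method): each propagation path contributes a phasor, and the pixel reports the phase of their sum, so the recovered range is pulled towards the interfering return. The modulation frequency and path amplitudes are made-up example values:

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def measured_range(ranges_m, amplitudes, f_mod=30e6):
        """Range reported by an AMCW pixel when several propagation paths
        with the given one-way ranges and return amplitudes superimpose."""
        phases = 4 * np.pi * f_mod * np.asarray(ranges_m) / C
        total = np.sum(np.asarray(amplitudes) * np.exp(1j * phases))
        return C * np.angle(total) / (4 * np.pi * f_mod)

    # direct return at 2.0 m plus a weaker inter-reflection at 3.5 m
    print(measured_range([2.0, 3.5], [1.0, 0.4]))   # ~2.3 m, biased away from 2.0 m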