Lee V. Streeter

Time of flight cameras produce real-time range maps at a relatively low cost using continuous wave amplitude modulation and demodulation. However, they are geared to measure range (or phase) for a single reflected bounce of light and suffer from systematic errors due to multipath interference. We re-purpose the conventional time of flight device for a new …
Time-of-flight range cameras acquire a three-dimensional image of a scene simultaneously for all pixels from a single viewing location. Attempts to use range cameras for metrology applications have been hampered by the multi-path problem, which causes range distortions when stray light interferes with the range measurement in a given pixel. Correcting …
Current Time-of-Flight approaches mainly incorporate a continuous wave intensity modulation approach. The phase reconstruction is performed using multiple phase images with different phase shifts, which is equivalent to sampling the inherent correlation function at different locations. This active imaging approach delivers a very specific set of influences …
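The sampling idea described here, evaluating the correlation function at several phase offsets, is commonly implemented with four equally spaced phase steps. A minimal sketch under that assumption (names are illustrative, not code from the paper):

```python
import math

def phase_from_samples(i0, i1, i2, i3):
    """Recover the modulation phase from four correlation samples taken
    at phase shifts of 0, 90, 180 and 270 degrees. Assumes each sample
    is c(tau_k) = A*cos(phi - tau_k) + B; pairwise differences cancel
    the offset B and isolate the sine and cosine components."""
    return math.atan2(i1 - i3, i0 - i2) % (2 * math.pi)

# Example: synthesize the four samples for a known phase and recover it.
A, B, phi = 1.0, 2.0, 1.0
samples = [A * math.cos(phi - k * math.pi / 2) + B for k in range(4)]
```

Here `phase_from_samples(*samples)` returns the original `phi`, independent of the amplitude `A` and the background offset `B`.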
Amplitude-modulated continuous wave (AMCW) time-of-flight (ToF) range imaging cameras measure distance by illuminating the scene with amplitude-modulated light and measuring the phase difference between the transmitted and reflected modulation envelope. This method of optical range measurement suffers from errors caused by multiple propagation paths …
The emergence of commercial time of flight (ToF) cameras for real-time depth images has motivated extensive study of how ToF information can be exploited. In principle, a ToF camera is an active sensor that emits an amplitude modulated near-infrared (NIR) signal, which illuminates a given scene. The per-pixel phase difference of the modulation between reflected …
Time-of-Flight (ToF) range cameras measure the depth from the camera to the objects in the field of view. This is achieved by illuminating the scene with amplitude modulated light and measuring the phase shift in the modulation envelope between the outgoing and reflected light. ToF cameras suffer from measurement errors when multiple propagation paths …
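The phase-to-distance relation these abstracts rely on can be written out directly: for modulation frequency f, a measured phase shift phi maps to d = c·phi/(4πf), with an unambiguous range of c/(2f) because the light travels out and back. A sketch (parameter values are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phi, f_mod):
    """Convert a measured phase shift (radians) into a one-way distance
    (metres) for an AMCW camera modulating at f_mod hertz. The light
    traverses the path twice, hence 4*pi rather than 2*pi."""
    return C * phi / (4 * math.pi * f_mod)
```

For example, at 30 MHz a phase shift of π corresponds to roughly 2.5 m, half of the ~5 m ambiguity distance.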
Full-field range imaging cameras, which are an example of amplitude modulated continuous wave (AMCW) lidar, are subject to multiple error sources. In this paper we consider the possibility that a global jitter and/or drift occurs between the modulation signals to the active illumination source and the camera sensor. A model for jitter is presented and is …
Amplitude modulated continuous wave (AMCW) time of flight (ToF) range imaging provides a full field of distance measurement, but common hardware is implemented with digital technology, which leads to unwanted harmonic content, a principal source of error in the distance measurements. Existing strategies for correction of harmonics require auxiliary …
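As a toy illustration of why harmonic content matters (this is not the correction strategy developed in the paper): square-wave modulation yields an ideally triangular correlation waveform, and sampling it at four phase steps leaves a cyclic phase error because the odd harmonics alias onto the fundamental.

```python
import math

def triangle_correlation(tau):
    """Idealized correlation of two square waves: a triangle wave of
    period 2*pi peaking at tau = 0, with harmonics at 1, 3, 5, ..."""
    x = (tau + math.pi) % (2 * math.pi) - math.pi
    return 1.0 - 2.0 * abs(x) / math.pi

def four_step_phase(true_phi):
    """Estimate phase from four samples of the triangular waveform,
    using the standard four-step arctangent formula."""
    s = [triangle_correlation(true_phi - k * math.pi / 2) for k in range(4)]
    return math.atan2(s[1] - s[3], s[0] - s[2]) % (2 * math.pi)
```

The estimate is exact at multiples of π/4 but deviates in between (about 0.07 rad at phi = π/8), producing a distance error that cycles with range, the characteristic "wiggling" signature of harmonic contamination.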