Kinect range sensing: Structured-light versus Time-of-Flight Kinect

@article{Sarbolandi2015KinectRS,
  title={Kinect range sensing: Structured-light versus Time-of-Flight Kinect},
  author={Hamed Sarbolandi and Damien Lefloch and Andreas Kolb},
  journal={Comput. Vis. Image Underst.},
  year={2015},
  volume={139},
  pages={1--20}
}
This work compares Kinect Structured-Light with Kinect Time-of-Flight cameras. The results offer descriptions of the conditions under which one is superior to the other. Solid insight into the devices is given to support decisions on their application. We propose a set of nine tests for comparing both Kinects, five of which are novel. Recently, the new Kinect One has been issued by Microsoft, providing the next generation of real-time range sensing devices based on the Time-of-Flight (ToF) principle. As the …
Comparative Study of Intel R200, Kinect v2, and Primesense RGB-D Sensors Performance Outdoors
A comparison of all three sensors for capturing 3D surface data of simple objects outdoors, where strong sunlight causes NIR interference, shows that the Microsoft Kinect v2 outperforms the R200 and that the Asus Xtion cannot capture 3D data outdoors.
Depth analysis of kinect v2 sensor in different mediums
An analysis of the error in the depth measurement, as well as a calculation of the depth entropy given by the Kinect v2 sensor in different mediums, has been carried out; the findings from the error analysis are used to build an error compensation model that can correct the depth at each pixel of the image.
Commodity RGB-D Sensors: Data Acquisition
  • M. Zollhöfer
  • Computer Science
  • RGB-D Image Analysis and Processing
  • 2019
This chapter focuses on modern active commodity range sensors based on time-of-flight and structured light, and discusses the noise characteristics, working ranges, and types of errors made by the different sensing modalities.
An overview of depth cameras and range scanners based on time-of-flight technologies
The underlying measurement principles of time-of-flight cameras are described, including pulsed-light cameras, which directly measure the time taken for a light pulse to travel from the device to the object and back again, and continuous-wave-modulated light cameras, which measure the phase difference between the emitted and received signals and hence obtain the travel time indirectly.
Evaluation of the Azure Kinect and Its Comparison to Kinect V1 and Kinect V2
The new Azure Kinect is thoroughly evaluated: its warm-up time, precision, accuracy (measured thoroughly, using a robotic arm), reflectivity, and the multipath and flying-pixel phenomena; its performance in both indoor and outdoor environments, including direct and indirect sun conditions, is also validated.
RGB-D Sensors Data Quality Assessment and Improvement for Advanced Applications
This chapter provides an overview of RGB-D sensor technology and an analysis of how random and systematic 3D measurement errors affect the global 3D data quality in the various technological implementations.
Real-time processing of range data focusing on environment reconstruction
A new method has been designed that uses surface curvature information to robustly reconstruct fine structures of small objects, while limiting the total error of camera drift, leading to faster convergence of high-quality reconstructions.
Comparison of Depth Cameras for 3D Reconstruction in Medicine
The results showed that both time-of-flight and stereoscopic cameras, using the developed rotating camera rig, provided repeatable body-scanning data with minimal operator-induced error; however, the time-of-flight camera generated more accurate 3D point clouds than the stereoscopic sensor.
Miniaturized 3D Depth Sensing-Based Smartphone Light Field Camera
A miniaturized 3D depth camera based on a light field camera (LFC), configured with a single aperture and a micro-lens array (MLA) serving as a multi-camera system for 3D surface imaging; multi-viewpoint image acquisition is demonstrated via the miniaturized 3D camera module integrated into a smartphone.

References

Showing 1-10 of 68 references
When Can We Use KinectFusion for Ground Truth Acquisition
The results suggest that the KinectFusion method is suitable for the fast acquisition of medium-scale scenes (a few meters across), filling a gap between structured-light and LiDAR scanners.
Enhanced Computer Vision With Microsoft Kinect Sensor: A Review
A comprehensive review of recent Kinect-based computer vision algorithms and applications, covering topics including preprocessing, object tracking and recognition, human activity analysis, hand gesture analysis, and indoor 3-D mapping.
Study on the use of Microsoft Kinect for robotics applications
The Microsoft X-Box Kinect Sensor is a revolutionary new depth camera that is used in the gaming industry to capture the motions of people and players efficiently using the technology of an RGB camera …
Depth Camera Technology Comparison and Performance Evaluation
In this paper, several depth cameras of different types were put to the test on a variety of tasks in order to judge their respective performance and to find out their weaknesses.
Accuracy and Resolution of Kinect Depth Data for Indoor Mapping Applications
The calibration of the Kinect sensor is discussed, and an analysis of the accuracy and resolution of its depth data is provided, based on a mathematical model of depth measurement from disparity.
A State of the Art Report on Kinect Sensor Setups in Computer Vision
This report gives a comprehensive overview of the main publications using the Microsoft Kinect outside its original context as a decision-forest-based motion-capturing tool.
Time-of-Flight sensor calibration for accurate range sensing
A new intensity-based calibration model is proposed that requires less input data than other models, significantly reducing the amount of calibration data and the number of necessary reference images.
Markerless Motion Capture using multiple Color-Depth Sensors
This work systematically evaluates the concurrent use of one to four Kinects, including calibration, error measures, and analysis, and presents a time-multiplexing approach to reducing or mitigating the detrimental effects of multiple active light emitters, thereby allowing motion capture from all angles.
Shake'n'sense: reducing interference for overlapping structured light depth cameras
A novel yet simple technique that mitigates the interference caused when multiple structured-light depth cameras point at the same part of a scene, which is particularly useful for Kinect, where the structured-light source is not modulated.
KinectFusion: Real-time dense surface mapping and tracking
We present a system for accurate real-time mapping of complex and arbitrary indoor scenes in variable lighting conditions, using only a moving low-cost depth camera and commodity graphics hardware.