Advanced tracking through efficient image processing and visual-inertial sensor fusion

@inproceedings{Bleser2008AdvancedTT,
  title={Advanced tracking through efficient image processing and visual-inertial sensor fusion},
  author={Gabriele Bleser and Didier Stricker},
  booktitle={2008 IEEE Virtual Reality Conference},
  year={2008},
  pages={137-144}
}

Citations

Using the marginalised particle filter for real-time visual-inertial sensor fusion
  • G. Bleser, D. Stricker
  • Engineering
    2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality
  • 2008
TLDR
This paper addresses the question of how to use measurements from low-cost inertial sensors (gyroscopes and accelerometers) to compensate for the missing control information and develops a real-time capable sensor fusion strategy based upon the marginalised particle filter (MPF) framework.
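The MPF idea above splits the state into a nonlinear part tracked by particles and a conditionally linear part handled by one Kalman filter per particle. Below is a minimal sketch of that structure on a toy scalar model (an angle driven by an unknown constant rate), not the paper's actual camera/IMU model; all noise parameters are illustrative assumptions.

```python
# Marginalised particle filter sketch on a toy model: particles for the
# nonlinear angle eta, one scalar Kalman filter per particle for the rate z.
import numpy as np

rng = np.random.default_rng(0)
N, T, dt = 500, 200, 0.01          # particles, steps, sample time
Q_eta, Q_z, R = 1e-4, 1e-6, 0.05   # process/measurement noise (assumed)

# Simulate ground truth: angle eta driven by an unknown constant rate z.
z_true = 0.8
eta_true = np.cumsum(np.full(T, z_true * dt))
y = np.sin(eta_true) + rng.normal(0.0, np.sqrt(R), T)  # nonlinear measurement

eta = rng.normal(0.0, 0.1, N)      # particle set for eta
m = np.zeros(N)                    # per-particle KF means for z
P = np.ones(N)                     # per-particle KF variances for z
w = np.full(N, 1.0 / N)

for t in range(T):
    # 1) Weight particles with the nonlinear measurement y = sin(eta) + e.
    w *= np.exp(-0.5 * (y[t] - np.sin(eta)) ** 2 / R)
    w /= w.sum()

    # 2) Resample when the effective sample size degenerates.
    if 1.0 / np.sum(w ** 2) < N / 2:
        idx = rng.choice(N, N, p=w)
        eta, m, P = eta[idx], m[idx], P[idx]
        w = np.full(N, 1.0 / N)

    # 3) Propagate eta, marginalising over the Gaussian z: the predictive
    #    variance picks up dt^2 * P from the uncertain rate.
    eta_new = eta + dt * m + rng.normal(0.0, np.sqrt(dt**2 * P + Q_eta))

    # 4) The sampled increment acts as a linear measurement of z:
    #    (eta_new - eta) = dt * z + v, so run a scalar KF update per particle.
    innov = (eta_new - eta) - dt * m
    S = dt**2 * P + Q_eta
    K = P * dt / S
    m += K * innov
    P = P - K * dt * P + Q_z       # measurement update + time update
    eta = eta_new

print("estimated rate z:", np.sum(w * m), "true:", z_true)
```

The payoff of marginalisation is that the particles only have to cover the nonlinear dimensions, which is what makes the approach real-time capable for visual-inertial state vectors.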
Hybrid marker-less camera pose tracking with integrated sensor fusion
TLDR
This paper proposes an algorithm for visual-inertial camera pose tracking, using adaptive recursive particle filtering, which combines the agility of inertial-based tracking with the robustness of vision-based tracking and implements an intelligent decision-making process.
Visual-Inertial Sensor Fusion for Tracking in Ambulatory Environments
TLDR
This thesis takes a ground-up approach towards a complete visual-inertial system: from camera calibration all the way to the handling of asynchronous sensor measurements for sensor fusion.
Visual-Inertial Tracking Using Optical Flow Measurements
Visual-inertial tracking is a well-known technique for tracking a combination of a camera and an inertial measurement unit (IMU). An issue with the straightforward approach is the need for known 3D …
Extended sensor fusion for embedded video applications
TLDR
A novel approach to real-time hybrid monocular visual-inertial odometry for embedded platforms is introduced and it is shown that the method outperforms classical hybrid techniques in ego-motion estimation.
Fusing Inertial Sensor Data in an Extended Kalman Filter for 3D Camera Tracking
TLDR
This paper provides an extensive performance comparison of every possible combination of fusing accelerometer and gyroscope data as control or measurement inputs using the same data set collected at different motion speeds and shows that it is always better to fuse both sensors in the measurement stage.
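As an illustration of the configuration that the comparison above favours, the sketch below fuses both inertial sensors in the measurement stage: a constant-angular-velocity model predicts, then the accelerometer tilt and the gyroscope rate both enter as measurements. It is a scalar pitch toy, and all noise values are illustrative assumptions, not the paper's.

```python
# Kalman filter with both inertial sensors fused in the measurement stage.
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [pitch (rad), pitch rate]
H = np.eye(2)                           # both states observed directly
Q = np.diag([1e-6, 1e-4])               # process noise (assumed)
R = np.diag([0.05, 0.001])              # accel tilt / gyro rate noise (assumed)

x = np.zeros(2)
P = np.eye(2)

def step(x, P, acc, gyro_rate):
    """One KF cycle: predict with the motion model, update with both sensors."""
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Accelerometer gives a tilt angle (valid when linear acceleration is low);
    # the gyroscope gives the angular rate directly.
    z = np.array([np.arctan2(acc[0], acc[2]), gyro_rate])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Usage with synthetic readings from a sensor pitching at 0.1 rad/s.
for k in range(100):
    true_pitch = 0.1 * dt * k
    acc = 9.81 * np.array([np.sin(true_pitch), 0.0, np.cos(true_pitch)])
    x, P = step(x, P, acc, gyro_rate=0.1)
print("pitch estimate:", x[0], "rate estimate:", x[1])
```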
Extended Kalman Filter-Based Methods for Pose Estimation Using Visual, Inertial and Magnetic Sensors: Comparative Analysis and Performance Evaluation
TLDR
Measurements from a monocular vision system are fused with inertial/magnetic measurements from an Inertial Measurement Unit (IMU) rigidly connected to the camera to estimate the pose of the IMU/camera sensor moving relative to a rigid scene (ego-motion).
Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter
TLDR
Experimental results show that the proposed fusion of a six-degree-of-freedom inertial sensor with vision data yields low-cost, accurate positioning for an autonomous mobile robot; the method is computationally fast, reliable, and robust, and can be considered for practical applications.
Benchmarking Inertial Sensor-Aided Localization and Tracking Methods
TLDR
This paper investigates means to benchmark methods for camera pose localization and tracking that, in addition to a camera image, make use of inertial sensor measurements, and shows that synthesizing gravity measurements from ground-truth poses achieves results similar to using real sensor measurements at significantly less effort.
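The gravity-synthesis step can be illustrated directly: given a ground-truth world-from-sensor rotation, an accelerometer reading is approximated by rotating the world gravity vector into the sensor frame and negating it (specific force). The sketch below is a minimal version of that idea; the z-up convention, function name, and noise level are assumptions.

```python
# Synthesize an accelerometer gravity measurement from a ground-truth pose.
import numpy as np

def synthesize_gravity(R_world_from_sensor, noise_std=0.02, rng=None):
    """Rotate world gravity into the sensor frame to fake an accel reading."""
    rng = rng or np.random.default_rng()
    g_world = np.array([0.0, 0.0, -9.81])       # world z-up convention assumed
    g_sensor = R_world_from_sensor.T @ g_world  # inverse rotation = transpose
    # An ideal accelerometer at rest measures the reaction to gravity (-g).
    return -g_sensor + rng.normal(0.0, noise_std, 3)

# Usage: a sensor rolled 90 degrees about x should see gravity along its y axis.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
print(synthesize_gravity(R))
```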
Inertial sensor fusion for 3D camera tracking
TLDR
It is shown that, when using inertial sensors for 3D tracking, it is better to use a gyroscope as a control input, while an accelerometer can be used as either a measurement or a control input.
...

References

SHOWING 1-10 OF 48 REFERENCES
Going out: robust model-based tracking for outdoor augmented reality
TLDR
A model-based hybrid tracking system for outdoor augmented reality in urban environments enables accurate, real-time overlays for a handheld device; the accuracy and robustness of the resulting system are demonstrated through comparisons with map-based ground truth data.
Fusion of vision and gyro tracking for robust augmented reality registration
  • Suya You, U. Neumann
  • Engineering, Computer Science
    Proceedings IEEE Virtual Reality 2001
  • 2001
TLDR
A novel framework enables accurate augmented reality (AR) registration with integrated inertial gyroscope and vision tracking technologies; it combines the low-frequency stability of vision sensors with the high-frequency tracking of gyroscope sensors, achieving stable static and dynamic six-degree-of-freedom pose tracking.
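A complementary filter is the simplest embodiment of that low-frequency/high-frequency split: integrate the gyro at high rate and pull the estimate back toward each low-rate vision fix. The scalar sketch below illustrates the principle only; the rates, blend gain, and signal model are assumed, and the paper's actual system is a full six-degree-of-freedom fusion.

```python
# Complementary fusion of a high-rate gyro with low-rate vision pose fixes.
import numpy as np

GYRO_DT = 0.005          # 200 Hz gyro (assumed)
VISION_EVERY = 40        # vision fix every 40 gyro samples, i.e. 5 Hz
ALPHA = 0.1              # fraction of the vision correction applied per fix

def track(gyro_rates, vision_angles):
    """Dead-reckon with the gyro; blend in each vision fix as it arrives."""
    angle, out = 0.0, []
    for k, rate in enumerate(gyro_rates):
        angle += rate * GYRO_DT                  # high-frequency integration
        if k % VISION_EVERY == 0:                # low-frequency correction
            angle += ALPHA * (vision_angles[k // VISION_EVERY] - angle)
        out.append(angle)
    return out

# Usage: a biased gyro alone would drift; vision fixes keep it anchored.
rng = np.random.default_rng(1)
true_rate, bias = 0.5, 0.05
rates = true_rate + bias + rng.normal(0, 0.01, 400)
vision = [true_rate * GYRO_DT * VISION_EVERY * i for i in range(10)]
est = track(rates, vision)
print("final estimate:", est[-1], "truth:", true_rate * GYRO_DT * 400)
```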
Adaptable Model-Based Tracking Using Analysis-by-Synthesis Techniques
TLDR
A novel analysis-by-synthesis approach for real-time camera tracking in industrial scenarios based on the tracking of line features which are generated dynamically in every frame by rendering a polygonal model and extracting contours out of the rendered scene.
Online camera pose estimation in partially known and dynamic scenes
TLDR
A robust approach to real-time camera pose estimation is presented which depends neither on offline pre-processing steps nor on prior knowledge of the entire target scene, and is validated on synthetic as well as real video sequences.
Fusing points and lines for high performance tracking
  • E. Rosten, T. Drummond
  • Computer Science
    Tenth IEEE International Conference on Computer Vision (ICCV'05) Volume 1
  • 2005
TLDR
This paper presents a method for integrating the two systems and robustly combining the pose estimates they produce, and shows how on-line learning can be used to improve the performance of feature tracking.
Handling uncertain sensor data in vision-based camera tracking
TLDR
A hybrid approach for real-time markerless tracking that couples camera and inertial sensor data; it addresses the camera/sensor synchronization problem and proposes a method to resynchronize the two devices online.
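One standard way to resynchronize a camera and an IMU online (not necessarily this paper's method) is to cross-correlate the gyro angular speed with the angular speed derived from successive vision rotations and pick the lag that aligns them best. The sketch below assumes equal-length, uniformly sampled speed signals and a constant offset.

```python
# Estimate a constant camera/IMU time offset by cross-correlation.
import numpy as np

def estimate_offset(gyro_speed, vision_speed, dt, max_lag=50):
    """Return the time shift (s) of vision relative to gyro that best aligns
    the two angular-speed signals."""
    g = gyro_speed - np.mean(gyro_speed)
    v = vision_speed - np.mean(vision_speed)
    lags = range(-max_lag, max_lag + 1)
    # For each lag l, correlate g[t] against v[t + l] over the valid overlap.
    scores = [np.dot(g[max(0, -l): len(g) - max(0, l)],
                     v[max(0, l): len(v) - max(0, -l)]) for l in lags]
    return dt * lags[int(np.argmax(scores))]

# Usage: a synthetic signal delayed by 8 samples is recovered as ~0.08 s.
rng = np.random.default_rng(2)
s = rng.normal(0, 1, 500)
print(estimate_offset(s, np.roll(s, 8), dt=0.01))
```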
VIS-Tracker: a wearable vision-inertial self-tracker
TLDR
This work presents a demonstrated and commercially viable self-tracker, using robust software that fuses data from inertial and vision sensors, and presents a roadmap for how the system will migrate from artificial fiducials to natural ones.
Adaptive line tracking with multiple hypotheses for augmented reality
TLDR
A real-time model-based line tracking approach with adaptive learning of image edge features that can handle partial occlusion and illumination changes; the algorithm is evaluated and shown to improve on other tracking approaches.
Robust structure from motion estimation using inertial data
  • G. Qian, R. Chellappa, Q. Zheng
  • Computer Science, Mathematics
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2001
TLDR
It is shown how inertial data can be used for improved noise resistance, reduction of inherent ambiguities, and handling of mixed-domain sequences, and that the number of feature points needed for accurate and robust SfM estimation can be significantly reduced when inertial data are employed.
A framework for simultaneous localization and mapping utilizing model structure
TLDR
An algorithm is introduced which merges FastSLAM and the MPF; the result is an MPF algorithm for SLAM applications in which state vectors of higher dimension can be used.
...