An Introduction to Inertial and Visual Sensing

@article{Corke2007AnIT,
  title={An Introduction to Inertial and Visual Sensing},
  author={Peter Corke and Jorge Lobo and Jorge Dias},
  journal={The International Journal of Robotics Research},
  year={2007},
  volume={26},
  pages={519--535}
}
In this paper we present a tutorial introduction to two important senses for biological and robotic systems — inertial and visual perception. We discuss the fundamentals of these two sensing modalities from a biological and an engineering perspective. Digital camera chips and micro-machined accelerometers and gyroscopes are now commodities, and when combined with today's available computing they can provide robust estimates of self-motion as well as 3D scene structure, without external infrastructure…
A new calibration method for an inertial and visual sensing system
The relative pose between the inertial and visual sensors mounted on an autonomous robot is calibrated in two steps; the translation parameters of the relative pose are obtained from at least two corresponding points in the two images captured before and after a one-step motion.
Modeling and Calibration of Inertial and Vision Sensors
A new algorithm for estimating the relative translation and orientation of an inertial measurement unit and a camera, which requires no additional hardware except a piece of paper with a checkerboard pattern on it; it works well in practice for both perspective and spherical cameras.
Robust real-time tracking by fusing measurements from inertial and vision sensors
A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, in which 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed; it achieves an absolute accuracy of 2 cm in position and 1° in orientation.
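The multi-rate idea above — predict at the fast inertial rate, correct whenever a slower vision measurement arrives — can be illustrated with a minimal 1D Kalman filter sketch. This is an assumption-laden toy (a 2-state constant-velocity model, not the paper's 22-state model; all noise values are made-up tuning guesses), using the same 100 Hz / 12.5 Hz rate ratio:

```python
import numpy as np

# Toy multi-rate Kalman filter: state x = [position, velocity],
# predicted at 100 Hz from an accelerometer sample, corrected at
# 12.5 Hz (every 8th step) by a position fix from vision.

dt = 0.01                                # 100 Hz inertial rate
F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity transition
B = np.array([0.5 * dt**2, dt])          # acceleration input mapping
H = np.array([[1.0, 0.0]])               # vision observes position only
Q = 1e-4 * np.eye(2)                     # process noise (tuning guess)
R = np.array([[1e-3]])                   # vision noise (tuning guess)

x = np.zeros(2)
P = np.eye(2)

def predict(x, P, accel):
    """Fast-rate propagation with an inertial sample."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Slow-rate correction with a vision position measurement."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate 1 s of constant acceleration a = 1 m/s^2.
for k in range(100):
    x, P = predict(x, P, accel=1.0)
    if (k + 1) % 8 == 0:                 # one vision frame per 8 IMU samples
        t = (k + 1) * dt
        x, P = update(x, P, np.array([0.5 * t**2]))  # true position
```

After 1 s of simulated motion the estimate converges to the true position 0.5 m and velocity 1 m/s; a real system adds biases, lever-arm terms, and time-stamping, which is where most of the 22 states go.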
Relative Pose Calibration Between Visual and Inertial Sensors
This paper proposes an approach to calibrating off-the-shelf cameras and inertial sensors to obtain a useful integrated system for static and dynamic situations. When both sensors are…
A Framework for Real-Time Scene Modelling based on Visual-Inertial Cues
A novel framework for visual-inertial scene reconstruction (VISrec!) is suggested, based on ideas from multi-sensor data fusion (MSDF), which helps compensate for typical problems of systems that rely only on visual information.
Fusion of Vision and Inertial Sensors for Position-Based Visual Servoing of a Robot Manipulator
A new technique is presented for combining measurements from a vision system and an inertial sensor mounted on a robot tip as the robot moves towards an object.
Accurate Human Navigation Using Wearable Monocular Visual and Inertial Sensors
A novel visual-inertial integration system for human navigation in free-living environments, in which measurements from wearable inertial and monocular visual sensors are integrated. An adaptive frame-rate single camera is used both to avoid motion blur, based on the compensated angular velocity and acceleration, and to enable a visual zero-velocity update during static motion.
Sensing and Control on the Sphere
The sphere is proposed as a unifying concept — not just for cameras, but for sensor fusion, estimation, and control — illustrated with simulation examples of spherical visual servoing and scene-structure estimation.
Extended Kalman Filter-Based Methods for Pose Estimation Using Visual, Inertial and Magnetic Sensors: Comparative Analysis and Performance Evaluation
Measurements from a monocular vision system are fused with inertial/magnetic measurements from an Inertial Measurement Unit (IMU) rigidly attached to the camera, in order to estimate the pose of the IMU/camera sensor moving relative to a rigid scene (ego-motion).
…
