Omnidirectional Egomotion Estimation From Back-projection Flow

@article{Shakernia2003OmnidirectionalEE,
  title={Omnidirectional Egomotion Estimation From Back-projection Flow},
  author={Omid Shakernia and Ren{\'e} Vidal and S. Shankar Sastry},
  journal={2003 Conference on Computer Vision and Pattern Recognition Workshop},
  year={2003},
  volume={7},
  pages={82-82}
}
The current state of the art for egomotion estimation with omnidirectional cameras is to map the optical flow to the sphere and then apply egomotion algorithms for spherical projection. In this paper, we propose to back-project image points to a virtual curved retina that is intrinsic to the geometry of the central panoramic camera, and to compute the optical flow on this retina: the so-called back-projection flow. We show that well-known egomotion algorithms can be easily adapted to work with the…
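For concreteness, the spherical back-projection that the abstract describes as the current state of the art can be sketched as follows. This is a minimal illustration assuming the unified central catadioptric model with mirror parameter xi; the paper's own virtual curved retina is not necessarily this sphere, and the function name lift_to_sphere and the calibrated coordinates (x, y) are notational assumptions, not taken from the paper.

import numpy as np

def lift_to_sphere(x, y, xi):
    # Back-project a calibrated image point (x, y) onto the unit sphere
    # centered at the single effective viewpoint, assuming the unified
    # central catadioptric model with mirror parameter xi
    # (xi = 1 for a parabolic mirror, 0 < xi < 1 for hyperbolic mirrors).
    r2 = x * x + y * y
    # Scale factor eta chosen so that ||(eta*x, eta*y, eta - xi)|| = 1.
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (r2 + 1.0)
    return np.array([eta * x, eta * y, eta - xi])

For a parabolic mirror (xi = 1) this reduces to eta = 2 / (r2 + 1); egomotion algorithms derived for spherical projection can then be applied to the lifted points and their flow.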
Adapted Approach for Omnidirectional Egomotion Estimation
TLDR
This paper estimates egomotion from a motion field computed with an adapted method that takes into account the distortions present in the omnidirectional image, and a comparison of error measures is given to confirm that successful estimation of camera motion is obtained when an adapted method is used to estimate optical flow.
Egomotion estimation with large field-of-view vision
TLDR
This thesis investigates the problem of egomotion estimation in a monocular, large Field-of-View (FOV) camera from two views of the scene by introducing the antipodal-epipolar constraints on relative camera motion.
Omnidirectional Egomotion Estimation from Adapted Motion Field
TLDR
This paper presents a method for recovering egomotion using an adapted motion field, based on a motion model and an adapted neighborhood described for a parabolic mirror, and shows that the use of an adapted motion field improves the estimation of the observer's motion.
Optical Flow Field Segmentation in an Omnidirectional Camera Image Based on Known Camera Motion
We consider the problem of optical flow analysis in an omnidirectional camera image, in order to segment out dynamic objects from ego-motion-induced optical flow. The aim is to use the camera on a
Optical flow field segmentation in an omnidirectional camera image based on known camera motion
TLDR
This work proposes to geometrically map the optical flow vectors onto a sphere centered at the projection center of the central panoramic camera, since the space angle was observed to be invariant to the errors of the planar-space assumption.
Structure from Small Baseline Motion with Central Panoramic Cameras
TLDR
This paper derives a multi-frame structure from motion algorithm for calibrated central panoramic cameras that is linear, amenable to real-time implementation, and performs well in the small baseline domain for which it is designed.
Motion estimation by decoupling rotation and translation in catadioptric vision
TLDR
This paper presents a method for estimating six-degree-of-freedom camera motion from central catadioptric images in man-made environments by decoupling the rotation and the translation, and shows that the line-based approach makes it possible to estimate the absolute attitude at each frame, without error accumulation.
Parametric ego-motion estimation for vehicle surround analysis using an omnidirectional camera
TLDR
This paper applies parametric ego-motion estimation for vehicle detection to perform surround analysis with an automobile-mounted omnidirectional camera, using a parametric planar motion model to compensate for distortion in omnidirectional images.
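A parametric planar motion model of the kind mentioned here is, for conventional perspective images, the standard eight-parameter quadratic flow of a plane undergoing instantaneous rigid motion; whether the cited paper uses exactly this parameterization is not stated in the summary, and the omnidirectional distortion compensation is a separate step:

$$u(x,y) = a_1 + a_2 x + a_3 y + a_7 x^2 + a_8 xy, \qquad v(x,y) = a_4 + a_5 x + a_6 y + a_7 xy + a_8 y^2.$$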
Detection of Moving Objects with a Mobile Omni-directional Camera
An omni-directional camera covers the full range of directions, giving a complete field of view. In the past, a moving object could be detected only when the camera was static or moving with a
Egomotion Estimation and Reconstruction with Kalman Filters and GPS Integration
TLDR
This paper presents an approach for egomotion estimation over stereo image sequences combined with extra GPS data, and proposes a novel technique that uses a set of augmented and selected keypoints, which are carefully tracked by a Kalman filter fusion.
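As a loose illustration of the kind of Kalman-filter fusion mentioned here, the sketch below runs one predict/update cycle of a constant-velocity filter on a 3-D position measurement (e.g. a GPS fix); the state layout, noise levels and function name are assumptions for illustration, not the cited paper's design.

import numpy as np

def kalman_step(x, P, z, dt, q=1e-2, r=1.0):
    # One predict/update cycle of a constant-velocity Kalman filter that
    # fuses a 3-D position measurement z with the state x = [position, velocity].
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                      # constant-velocity motion model
    H = np.hstack([np.eye(3), np.zeros((3, 3))])    # only position is observed
    Q = q * np.eye(6)                               # process noise (assumed)
    R = r * np.eye(3)                               # measurement noise (assumed)

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q

    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P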

References

Showing 1-10 of 19 references
Ego-motion and omnidirectional cameras
  • J. Gluckman, S. Nayar
  • Computer Science
    Sixth International Conference on Computer Vision (IEEE Cat. No.98CH36271)
  • 1998
TLDR
This paper proposes mapping the image velocity vectors to a sphere, using the Jacobian of the transformation between the projection model of the camera and spherical projection, and demonstrates the ability to compute ego-motion with omnidirectional cameras.
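As a rough sketch of the Jacobian-based mapping described here, one can push an image-plane flow vector through the derivative of the lifting map. The sketch below reuses the hypothetical lift_to_sphere helper above and estimates the Jacobian numerically, so it illustrates the idea rather than Gluckman and Nayar's closed-form expressions.

import numpy as np

def flow_to_sphere(x, y, u, v, xi, eps=1e-6):
    # Map an image-plane velocity (u, v) at point (x, y) to a tangent
    # vector on the sphere via the 3x2 Jacobian of the lifting map,
    # approximated here by central differences.
    def lift(p):
        return lift_to_sphere(p[0], p[1], xi)
    p = np.array([x, y], dtype=float)
    J = np.zeros((3, 2))
    for i in range(2):
        d = np.zeros(2)
        d[i] = eps
        J[:, i] = (lift(p + d) - lift(p - d)) / (2.0 * eps)
    return J @ np.array([u, v], dtype=float)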
A general approach for egomotion estimation with omnidirectional images
Computing a camera's ego-motion from an image sequence is easier to accomplish when a spherical retina is used, as opposed to a standard retinal plane. On a spherical field of view both the focus of
Comparison of approaches to egomotion computation
  • T. Tian, Carlo Tomasi, D. Heeger
  • Mathematics, Computer Science
    Proceedings CVPR IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • 1996
TLDR
It is found that the bias and sensitivity of the six algorithms evaluated are totally invariant with respect to the axis of rotation; it is widely believed that increasing the field of view will yield better performance, but this is not necessarily true.
Subspace methods for recovering rigid motion I: Algorithm and implementation
TLDR
This article shows that the nonlinear equation describing the optical flow field can be split by an exact algebraic manipulation to form three sets of equations, and shows that depth and rotation need not be known or estimated prior to solving for translation.
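The "nonlinear equation describing the optical flow field" is, in the usual calibrated perspective setting, bilinear in the unknowns. With notation assumed here (not taken from the article, and with sign conventions that vary between authors), the flow at image point $(x, y)$ with depth $Z$, translational velocity $\mathbf{t}$ and rotational velocity $\boldsymbol{\omega}$ reads

$$\dot{\mathbf{x}}(x,y) = \frac{1}{Z(x,y)}\,A(x,y)\,\mathbf{t} + B(x,y)\,\boldsymbol{\omega}, \qquad A = \begin{bmatrix} -1 & 0 & x \\ 0 & -1 & y \end{bmatrix}, \quad B = \begin{bmatrix} xy & -(1+x^2) & y \\ 1+y^2 & -xy & -x \end{bmatrix}.$$

The subspace trick eliminates the depth-dependent and rotational terms by projecting onto a suitable orthogonal complement, which is why translation can be solved for before depth and rotation are known.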
Infinitesimal motion estimation from multiple central panoramic views
We present an algorithm for infinitesimal motion estimation from multiple central panoramic views. We first derive the optical flow equations for central panoramic cameras as a function of both pixel
Multibody motion estimation and segmentation from multiple central panoramic views
TLDR
A factorization-based technique is proposed that estimates the number of independent motions, the segmentation of the image measurements and the motion of each object relative to the camera from a set of image points and their optical flows in multiple frames.
Passive navigation
TLDR
This paper derives a set of nine non-linear equations using a least-squares formulation that allows for the recovery of the motion of an observer relative to a planar surface directly from image brightness derivatives.
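The starting point of such direct (flow-free) formulations is the brightness-change constraint equation, which couples the spatio-temporal image derivatives to the motion field without computing optical flow explicitly; in standard notation (not necessarily the paper's),

$$E_x\,u + E_y\,v + E_t = 0,$$

where $(E_x, E_y, E_t)$ are the image brightness derivatives and $(u, v)$ is the motion field, here parameterized by the observer's motion relative to the planar surface.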
Omni-directional vision for robot navigation
We describe a method for vision-based robot navigation with a single omni-directional (catadioptric) camera. We show how omni-directional images can be used to generate the representations needed for
Linear Differential Algorithm for Motion Recovery: A Geometric Approach
TLDR
A precise characterization of the space of differential essential matrices gives rise to a novel eigenvalue-decomposition-based 3D velocity estimation algorithm from optical flow measurements that yields a unique solution to the motion estimation problem and serves as a differential counterpart of the well-known SVD-based 3D displacement estimation algorithm for the discrete case.
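For reference, the differential (continuous-time) epipolar constraint underlying the space of differential essential matrices can be written, for a calibrated image point $\mathbf{x}$ with measured flow $\mathbf{u}$, linear velocity $\mathbf{v}$ and angular velocity $\boldsymbol{\omega}$ (notation assumed here), as

$$\mathbf{u}^{\top}\widehat{\mathbf{v}}\,\mathbf{x} + \mathbf{x}^{\top}\widehat{\boldsymbol{\omega}}\,\widehat{\mathbf{v}}\,\mathbf{x} = 0,$$

where $\widehat{\cdot}$ denotes the skew-symmetric matrix of a 3-vector; this is the differential counterpart of the discrete epipolar constraint $\mathbf{x}_2^{\top} E\,\mathbf{x}_1 = 0$.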
Image processing in catadioptric planes: spatiotemporal derivatives and optical flow computation
Images produced by catadioptric sensors contain a significant amount of radial distortion and variation in inherent scale. Blind application of conventional shift-invariant operators or optical flow