Optimization On Manifolds: Methods And Applications

@inproceedings{Absil2010OptimizationOM,
  title={Optimization On Manifolds: Methods And Applications},
  author={Pierre-Antoine Absil and Robert E. Mahony and Rodolphe Sepulchre},
  year={2010}
}
This paper provides an introduction to the topic of optimization on manifolds. The approach taken uses the language of differential geometry; however, we choose to emphasise the intuition of the concepts and the structures that are important in generating practical numerical algorithms rather than the technical details of the formulation. There are a number of algorithms that can be applied to solve such problems and we discuss the steepest descent and Newton’s method in some detail as well as…
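As a concrete illustration of the steepest-descent scheme the abstract mentions, the sketch below minimizes the Rayleigh quotient f(x) = x'Ax over the unit sphere: the Euclidean gradient is projected onto the tangent space at the current iterate, a step is taken, and the iterate is retracted onto the sphere by normalization. The step size, tolerance, and test matrix are illustrative choices, not values taken from the paper.

import numpy as np

def sphere_steepest_descent(A, x0, step=0.1, tol=1e-8, max_iter=1000):
    # Riemannian steepest descent on the unit sphere for f(x) = x' A x.
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        egrad = 2 * A @ x                # Euclidean gradient of x' A x
        rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
        if np.linalg.norm(rgrad) < tol:
            break
        x = x - step * rgrad             # step along the negative gradient
        x = x / np.linalg.norm(x)        # retract back onto the sphere
    return x

A = np.diag([3.0, 2.0, 1.0])
x = sphere_steepest_descent(A, np.array([1.0, 1.0, 1.0]))
print(x)  # converges to ±e3, the eigenvector of the smallest eigenvalue

The normalization step is the simplest retraction on the sphere; a Newton-type method would replace the gradient step with a tangent-space Newton direction while keeping the same solve-in-tangent-space, retract-to-manifold structure.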

A New Approach to the Proximal Point Method: Convergence on General Riemannian Manifolds
Without requiring any restrictive assumptions about the sign of the sectional curvature of the manifold, full convergence is obtained for any bounded sequence generated by the proximal point method, in the case that the objective function satisfies the Kurdyka–Łojasiewicz inequality.
Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
We extend the scope of analysis for linesearch optimization algorithms on (possibly infinite-dimensional) Riemannian manifolds to the convergence analysis of the BFGS quasi-Newton scheme and the…
Accelerated Algorithms for Convex and Non-Convex Optimization on Manifolds
The proposed method unifies Nesterov's original idea for accelerating gradient descent with recent developments in Euclidean optimization algorithms, and it is shown that, when the objective function is convex, the algorithm provably converges to the optimum with an accelerated rate.
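One generic way to transplant such momentum schemes to a manifold is to carry the momentum vector into the tangent space of each new iterate and retract after every step. The sketch below does this on the unit sphere with a projection-based transport; it is an illustrative transplant of momentum acceleration, not the specific algorithm of the cited paper.

import numpy as np

def proj(x, v):
    # Orthogonal projection onto the tangent space of the unit sphere at x
    return v - (x @ v) * x

def momentum_descent_sphere(grad_f, x0, step=0.05, beta=0.9, iters=500):
    # Momentum with retraction: transport the momentum to the current
    # tangent space by projection, add the gradient step, renormalize.
    x = x0 / np.linalg.norm(x0)
    m = np.zeros_like(x)
    for _ in range(iters):
        m = beta * proj(x, m) - step * proj(x, grad_f(x))
        x = (x + m) / np.linalg.norm(x + m)
    return x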
Proximal Point Methods for Lipschitz Functions on Hadamard Manifolds: Scalar and Vectorial Cases
We study the convergence of exact and inexact versions of the proximal point method with a generalized regularization function in Hadamard manifolds for solving scalar and vectorial optimization…
An Efficient BFGS Algorithm for Riemannian Optimization
A convergence result for Riemannian line-search methods that ensures superlinear convergence is presented, together with a theory for building vector transports on submanifolds of R^n.
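For an embedded submanifold, one standard way to realize such a vector transport is to orthogonally project a tangent vector from the old iterate into the tangent space at the new one, which is what a BFGS update needs in order to compare gradients at different points. A minimal sketch on the unit sphere (an illustrative choice, not the paper's particular construction):

import numpy as np

def transport_sphere(y, v):
    # Projection-based vector transport: map a tangent vector v (taken at
    # some other point of the sphere) into the tangent space at y.
    return v - (y @ v) * y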
Towards optimization techniques on diffeological spaces by generalizing Riemannian concepts
A suitable definition of a tangent space, with a view to optimization methods, is presented, together with a diffeological Riemannian space and a diffeological gradient, which are needed in an optimization algorithm on diffeological spaces.
Projection-like Retractions on Matrix Manifolds
This theory offers a framework in which previously proposed retractions can be analyzed, as well as a toolbox for constructing new ones, for submanifolds of Euclidean spaces.
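A standard instance of such a projection-like retraction is the QR-based retraction on the Stiefel manifold St(n, p): move in the ambient space, then return to the manifold via the Q factor. A minimal sketch (the sign correction makes the factorization, and hence the retraction, well defined):

import numpy as np

def qr_retraction(X, xi):
    # Retract X + xi onto the Stiefel manifold of n-by-p matrices with
    # orthonormal columns, using the Q factor of a QR decomposition.
    Q, R = np.linalg.qr(X + xi)
    return Q * np.sign(np.diag(R))  # flip columns so that diag(R) > 0

X, _ = np.linalg.qr(np.random.randn(4, 2))  # a point on St(4, 2)
xi = 0.1 * np.random.randn(4, 2)            # a small ambient perturbation
Y = qr_retraction(X, xi)
print(Y.T @ Y)  # ≈ identity: Y again has orthonormal columns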
Piecewise rigid curve deformation via a Finsler steepest descent
This paper introduces a novel steepest descent flow in Banach spaces. This extends previous works on generalized gradient descent, notably the work of Charpiat et al., to the setting of Finsler…
Curvature and Torsion estimation of 3D functional data: A geometric approach to build the mean shape under the Frenet Serret framework
This work develops an alternative characterization of a mean that reflects the shape variation of the curves, and introduces new definitions of mean curvature and mean torsion, as well as of mean shape, through the notion of a mean vector field.
A Variational Approach to Registration with Local Exponential Coordinates
We identify a novel parameterization for the group of finite rotations SO(3), consisting of an atlas of exponential maps defined over local tangent planes, for the purpose of computing isometric…
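The exponential charts referred to here can be written in closed form for SO(3) via Rodrigues' rotation formula, exp([w]_x) = I + (sin t / t) [w]_x + ((1 - cos t) / t^2) [w]_x^2 with t = ||w||. A minimal sketch of that map (the formula is standard; the test vector is an illustrative choice):

import numpy as np

def so3_exp(w):
    # Exponential map from so(3) to SO(3) via Rodrigues' formula.
    t = np.linalg.norm(w)
    W = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])
    if t < 1e-12:
        return np.eye(3) + W  # first-order approximation near t = 0
    return np.eye(3) + (np.sin(t) / t) * W + ((1 - np.cos(t)) / t ** 2) * (W @ W)

R = so3_exp(np.array([0.0, 0.0, np.pi / 2]))  # 90° rotation about the z-axis
print(np.round(R, 6))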
...

References

Showing 1-10 of 111 references
The Geometry of Algorithms with Orthogonality Constraints
The theory proposed here provides a taxonomy for numerical linear algebra algorithms, offering a top-level mathematical view of previously unrelated algorithms; developers of new algorithms and perturbation theories will benefit from the theory.
The Geometry of the Newton Method on Non-Compact Lie Groups
The invariant structure of a Lie group is exploited to provide a strong interpretation of a Newton iteration on a general Lie group, and local asymptotic quadratic convergence is proved for the algorithms considered.
Geometrical interpretation of the predictor-corrector type algorithms in structured optimization problems
This article develops sufficient conditions for quadratic convergence of predictor-corrector methods using a proximal point correction step, and argues that returning in this manner is preferable to returning via the projection mapping.
Minimizing a differentiable function over a differential manifold
To generalize the descent methods of unconstrained optimization to the constrained case, we define intrinsically the gradient field of the objective function on the constraint manifold and analyze
Newton methods for nonsmooth convex minimization: connections among U-Lagrangian, Riemannian Newton and SQP methods
Newton-type methods for the minimization of partly smooth convex functions are studied, using local parameterizations obtained from U-Lagrangian theory and from Riemannian geometry.
Optimization Criteria and Geometric Algorithms for Motion and Structure Estimation
Prevailing efforts to study the standard formulation of motion and structure recovery have recently been focused on issues of sensitivity and robustness of existing techniques. While many cogent…
Learning algorithms utilizing quasi-geodesic flows on the Stiefel manifold
Riemannian subspace tracking algorithms on Grassmann manifolds
M. Baumann and U. Helmke, 2007 46th IEEE Conference on Decision and Control, 2007
A new class of Newton-type algorithms for adaptively computing the principal and minor subspaces of a time-varying family of symmetric matrices, using local parameterization of the Grassmann manifold, is proposed.
On VU-theory for Functions with Primal-Dual Gradient Structure
A space decomposition that allows one to identify a subspace on which the function appears to be smooth is discussed; the special structure of such a function is used to compute smooth trajectories along which certain second-order expansions can be obtained.