Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds

@article{Bergmann2021FenchelDT,
  title={Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds},
  author={Ronny Bergmann and Roland Herzog and M. S. Louzeiro and Daniel Tenbrinck and Jos{\'e} Vidal-N{\'u}{\~n}ez},
  journal={Found. Comput. Math.},
  year={2021},
  volume={21},
  pages={1465--1504}
}
This paper introduces a new notion of a Fenchel conjugate, which generalizes the classical Fenchel conjugation to functions defined on Riemannian manifolds. We investigate its properties, e.g., the Fenchel–Young inequality and the characterization of the convex subdifferential using the analogue of the Fenchel–Moreau Theorem. These properties of the Fenchel conjugate are employed to derive a Riemannian primal-dual optimization algorithm and to prove its convergence for the case of Hadamard… 
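For reference, the classical (Euclidean) Fenchel conjugate and the Fenchel–Young inequality mentioned in the abstract read as follows; the last formula sketches one way such a conjugate can be carried to a manifold by pulling the function back to a tangent space via the exponential map. The notation is illustrative and not necessarily the paper's exact definition:

```latex
% Classical Fenchel conjugate of f : \mathbb{R}^n \to \overline{\mathbb{R}}
f^{*}(\xi) = \sup_{x \in \mathbb{R}^n} \bigl\{ \langle \xi, x \rangle - f(x) \bigr\}

% Fenchel--Young inequality (immediate from the definition of the supremum)
f(x) + f^{*}(\xi) \geq \langle \xi, x \rangle
\quad \text{for all } x \in \mathbb{R}^n,\ \xi \in (\mathbb{R}^n)^{*}

% Sketch of a manifold analogue: conjugate at a base point m \in \mathcal{M},
% obtained by pulling f back to the tangent space via the exponential map
f^{*}_{m}(\xi) = \sup_{X \in \mathcal{T}_m\mathcal{M}}
  \bigl\{ \langle \xi, X \rangle - f(\exp_m X) \bigr\},
\qquad \xi \in \mathcal{T}^{*}_m\mathcal{M}
```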
Fenchel Duality and a Separation Theorem on Hadamard Manifolds
In this paper, we introduce a definition of Fenchel conjugate and Fenchel biconjugate on Hadamard manifolds based on the tangent bundle. Our definition overcomes the inconvenience that the conjugate…
Duality-based Higher-order Non-smooth Optimization on Manifolds
We propose a method for solving non-smooth optimization problems on manifolds. In order to obtain superlinear convergence, we apply a Riemannian Semi-smooth Newton method to a non-smooth non-linear…
First-Order Primal–Dual Methods for Nonsmooth Non-convex Optimisation
We provide an overview of primal-dual algorithms for nonsmooth and non-convex-concave saddle-point problems. This flows around a new analysis of such methods, using Bregman divergences to formulate…
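As a concrete Euclidean instance of such a first-order primal-dual scheme, here is a minimal Chambolle–Pock-style iteration for 1-D total-variation denoising. The function name and step sizes are illustrative (chosen so that tau·sigma·‖D‖² < 1), not taken from any of the papers above:

```python
import numpy as np

def tv_denoise_pd(b, lam=1.0, tau=0.4, sigma=0.4, iters=500):
    """Primal-dual (Chambolle-Pock-style) iteration for
    min_x 0.5*||x - b||^2 + lam*||Dx||_1, with D = forward differences."""
    n = len(b)
    D = np.diff(np.eye(n), axis=0)           # (n-1) x n difference matrix
    x = b.copy()
    xbar = x.copy()
    y = np.zeros(n - 1)
    for _ in range(iters):
        # dual ascent step, then projection onto [-lam, lam] (prox of f*)
        y = np.clip(y + sigma * D @ xbar, -lam, lam)
        # primal descent step: prox of the quadratic data term
        x_new = (x - tau * D.T @ y + tau * b) / (1 + tau)
        xbar = 2 * x_new - x                 # over-relaxation / extrapolation
        x = x_new
    return x

# piecewise-constant signal corrupted by Gaussian noise
rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(20), np.ones(20)])
noisy = truth + 0.2 * rng.standard_normal(40)
denoised = tv_denoise_pd(noisy, lam=0.5)
```

The step-size condition tau·sigma·‖D‖² ≤ 1 holds here since ‖D‖² < 4 for forward differences.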
Total variation of the normal vector field as shape prior
An analogue of the total variation prior for the normal vector field along the boundary of smooth shapes in 3D is introduced. The analysis of the total variation of the normal vector field is based
Manopt.jl: Optimization on Manifolds in Julia
Algorithms include the derivative-free Particle Swarm and Nelder–Mead algorithms, as well as classical gradient, conjugate gradient and stochastic gradient descent.
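Manopt.jl itself is written in Julia; as a language-neutral illustration of the idea behind its gradient-descent solvers — project the Euclidean gradient onto the tangent space, then retract back to the manifold — here is a minimal NumPy sketch on the unit sphere. The function name is hypothetical, not Manopt's API:

```python
import numpy as np

def sphere_gradient_descent(A, x0, step=0.1, iters=200):
    """Riemannian gradient descent on the unit sphere for
    f(x) = -x^T A x (i.e. maximizing the Rayleigh quotient).
    Tangent projection:  P_x(v) = v - (x^T v) x
    Retraction:          R_x(v) = (x + v) / ||x + v||"""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = -2 * A @ x                  # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x     # project onto tangent space at x
        y = x - step * rgrad                # step in the tangent space
        x = y / np.linalg.norm(y)           # retract back to the sphere
    return x

A = np.diag([3.0, 1.0, 0.5])
x = sphere_gradient_descent(A, np.array([1.0, 1.0, 1.0]))
# x aligns with the dominant eigenvector of A
```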
The difference of convex algorithm on Riemannian manifolds
In this paper we propose a Riemannian version of the difference of convex algorithm (DCA) to solve a minimization problem involving a difference-of-convex (DC) function. Equivalence between the…
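In the Euclidean setting, each DCA step linearizes the concave part −h at the current iterate and minimizes the resulting convex model. A toy sketch with illustrative function names (not the Riemannian algorithm of the paper):

```python
import numpy as np

def dca(grad_h, argmin_g_linear, x0, iters=100):
    """Euclidean difference-of-convex algorithm (DCA) sketch for f = g - h:
    x_{k+1} = argmin_x  g(x) - <grad h(x_k), x>."""
    x = x0
    for _ in range(iters):
        x = argmin_g_linear(grad_h(x))
    return x

# toy DC decomposition: f(x) = x^4 - x^2 with g(x) = x^4, h(x) = x^2
grad_h = lambda x: 2.0 * x
# argmin_x x^4 - s*x  =>  4 x^3 = s  =>  x = sign(s) * (|s|/4)^(1/3)
argmin_g_linear = lambda s: np.sign(s) * (abs(s) / 4.0) ** (1.0 / 3.0)

x_star = dca(grad_h, argmin_g_linear, x0=1.0)
# converges to the stationary point x = 1/sqrt(2) of x^4 - x^2
```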
An Inexact Semismooth Newton Method on Riemannian Manifolds with Application to Duality-Based Total Variation Denoising
An inexact version of the Riemannian Semismooth Newton method is proposed and conditions for local linear and superlinear convergence that hold independent of the sign of the curvature are proved.

References

Showing 1–10 of 65 references.
Proximal Point Algorithm On Riemannian Manifolds
In this paper we consider the minimization problem with constraints. We will show that if the set of constraints is a Riemannian manifold of nonpositive sectional curvature, and the…
Local Convex Conjugacy and Fenchel Duality
Riemannian Geometry
The recent physical interpretation of intrinsic differential geometry of spaces has stimulated the study of this subject. Riemann proposed the generalisation, to spaces of any order, of Gauss's…
Convex Functions and Optimization Methods on Riemannian Manifolds
Preface. 1. Metric properties of Riemannian manifolds. 2. First and second variations of the p-energy of a curve. 3. Convex functions on Riemannian manifolds. 4. Geometric examples of convex…
Subgradient Algorithm on Riemannian Manifolds
The subgradient method is generalized to the context of Riemannian manifolds. The motivation can be seen in non-Euclidean metrics that occur in interior-point methods. In that frame, the natural…
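The Euclidean subgradient iteration with the classical diminishing step size can be sketched as follows; a Riemannian version would replace the linear update x − t·g with an exponential map or retraction of −t·g. All names here are illustrative:

```python
import numpy as np

def subgradient_method(subgrad, x0, iters=5000):
    """Euclidean subgradient method with diminishing steps t_k = 1/(k+1)."""
    x = float(x0)
    for k in range(iters):
        x -= subgrad(x) / (k + 1)   # step along a (negative) subgradient
    return x

# minimize f(x) = |x - 3|; a valid subgradient is sign(x - 3)
x = subgradient_method(lambda x: np.sign(x - 3.0), x0=0.0)
# iterates oscillate around the minimizer x = 3 with shrinking amplitude
```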
Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature
The gradient method for minimizing a differentiable convex function with Lipschitz continuous gradient on Riemannian manifolds with lower-bounded sectional curvature is analyzed, and the adaptive stepsize is shown to be a promising scheme worth considering.
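As an illustration of an adaptive stepsize in the Euclidean setting, here is gradient descent with Armijo backtracking (not the adaptive scheme of the paper above); a Riemannian variant would replace the linear update x − t·g with exp_x(−t·g). Names and constants are illustrative:

```python
import numpy as np

def gradient_descent_armijo(f, grad, x0, iters=100, beta=0.5, c=1e-4):
    """Euclidean gradient descent with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        t = 1.0
        # shrink t until the sufficient-decrease (Armijo) condition holds
        while f(x - t * g) > f(x) - c * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# minimize the convex quadratic f(x) = 0.5 * x^T Q x
Q = np.diag([2.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x
grad = lambda x: Q @ x
x = gradient_descent_armijo(f, grad, [1.0, 1.0])
```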
Smooth Nonlinear Optimization in Rn
Preface. 1. Introduction. 2. Nonlinear Optimization Problems. 3. Optimality Conditions. 4. Geometric Background of Optimality Conditions. 5. Deduction of the Classical Optimality Conditions in…
Introduction to Smooth Manifolds
Preface.- 1 Smooth Manifolds.- 2 Smooth Maps.- 3 Tangent Vectors.- 4 Submersions, Immersions, and Embeddings.- 5 Submanifolds.- 6 Sard's Theorem.- 7 Lie Groups.- 8 Vector Fields.- 9 Integral Curves…