Fenchel Duality and a Separation Theorem on Hadamard Manifolds

@article{Bergmann2022FenchelDA,
  title   = {Fenchel Duality and a Separation Theorem on {H}adamard Manifolds},
  author  = {Ronny Bergmann and Roland Herzog and M. S. Louzeiro},
  journal = {SIAM Journal on Optimization},
  year    = {2022},
  volume  = {32},
  pages   = {854--873}
}
In this paper, we introduce a definition of the Fenchel conjugate and Fenchel biconjugate on Hadamard manifolds based on the tangent bundle. Our definition overcomes the inconvenience that the conjugate depends on the choice of a certain point on the manifold, as previous definitions required. At the same time, this new definition retains the properties known to hold in the Euclidean case. It even yields a broader interpretation of the Fenchel conjugate in the Euclidean case itself. Most…
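For orientation (background not taken from the paper page itself): the classical Euclidean Fenchel conjugate that the paper generalizes has the standard form below, and earlier Riemannian definitions — a rough sketch of what the abstract alludes to — fix a base point $m \in M$ and use the logarithmic map, which is exactly the dependence the tangent-bundle definition removes.

```latex
% Classical Euclidean Fenchel conjugate (standard definition)
f^*(\xi) \;=\; \sup_{x \in \mathbb{R}^n} \bigl( \langle \xi, x \rangle - f(x) \bigr)

% Base-point-dependent Riemannian variant (a sketch of the earlier
% definitions the abstract alludes to; m \in M is the chosen base point)
f^*_m(\xi) \;=\; \sup_{x \in M} \bigl( \langle \xi, \log_m x \rangle - f(x) \bigr),
\qquad \xi \in T_m M
```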
1 Citation
The difference of convex algorithm on Riemannian manifolds
In this paper we propose a Riemannian version of the difference of convex algorithm (DCA) to solve a minimization problem involving the difference of convex (DC) functions. Equivalence between the…
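For intuition, here is a minimal Euclidean sketch of the DC algorithm (DCA) that the snippet refers to. The cited paper works on Riemannian manifolds; the toy objective and closed-form subproblem below are illustrative assumptions, not taken from the paper.

```python
# Euclidean DCA sketch on a toy DC objective (illustrative only):
# minimize f(x) = g(x) - h(x) with g(x) = x**4 (convex), h(x) = x**2 (convex).
# DCA step: linearize h at the current iterate x_k, then minimize the
# convex surrogate: x_{k+1} = argmin_x g(x) - h'(x_k) * x.

def dca(x, iters=100):
    for _ in range(iters):
        y = 2.0 * x                   # gradient of h at the current iterate
        x = (y / 4.0) ** (1.0 / 3.0)  # closed-form argmin: 4 x^3 = y
    return x

# Starting from x = 1.0, the iterates approach 1/sqrt(2) ≈ 0.7071,
# a stationary point of f, since f'(x) = 4x^3 - 2x vanishes there.
```

The linearization of `h` makes each subproblem convex, which is the core idea DCA transfers to the manifold setting.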

References

Showing 1–10 of 57 references
Fenchel Duality Theory and a Primal-Dual Algorithm on Riemannian Manifolds
This paper introduces a new notion of a Fenchel conjugate, which generalizes the classical Fenchel conjugation to functions defined on Riemannian manifolds. We investigate its properties, e.g., the…
Proximal Point Algorithm On Riemannian Manifolds
In this paper we consider the minimization problem with constraints. We will show that if the set of constraints is a Riemannian manifold of nonpositive sectional curvature, and the…
Sampling and Optimization on Convex Sets in Riemannian Manifolds of Non-Negative Curvature
To our knowledge, these are the first algorithms in the general setting of positively curved manifolds with provable polynomial guarantees under reasonable assumptions, and the first study of the connection between sampling and optimization in this setting.
Operator scaling via geodesically convex optimization, invariant theory and polynomial identity testing
A new second-order method for geodesically convex optimization on the natural hyperbolic metric over positive definite matrices is proposed, which yields a deterministic polynomial-time algorithm for a new class of Polynomial Identity Testing problems, which was the original motivation for studying operator scaling.
Functional Analysis, Sobolev Spaces and Partial Differential Equations
Preface.- 1. The Hahn-Banach Theorems. Introduction to the Theory of Conjugate Convex Functions.- 2. The Uniform Boundedness Principle and the Closed Graph Theorem. Unbounded Operators. Adjoint.
Introduction to Smooth Manifolds
Preface.- 1 Smooth Manifolds.- 2 Smooth Maps.- 3 Tangent Vectors.- 4 Submersions, Immersions, and Embeddings.- 5 Submanifolds.- 6 Sard's Theorem.- 7 Lie Groups.- 8 Vector Fields.- 9 Integral Curves…
Subgradient Algorithm on Riemannian Manifolds
The subgradient method is generalized to the context of Riemannian manifolds. The motivation can be seen in non-Euclidean metrics that occur in interior-point methods. In that frame, the natural…
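As a point of comparison, the Euclidean subgradient method that this reference generalizes can be sketched as follows. This is standard textbook background, not code from the cited work; the objective `|x|` and the diminishing step size are illustrative choices.

```python
# Euclidean subgradient method with diminishing steps t_k = 1/k
# (standard background; the cited paper transfers this idea to manifolds).
# Example objective: f(x) = |x|, with subgradient sign(x).

def sign(v):
    return (v > 0) - (v < 0)          # a subgradient of |x| at v

def subgradient_method(subgrad, x, steps=1000):
    best = x
    for k in range(1, steps + 1):
        x = x - subgrad(x) / k        # step along a negative subgradient
        if abs(x) < abs(best):        # keep the best iterate seen so far
            best = x
    return best

x_best = subgradient_method(sign, 5.0)  # approaches the minimizer x = 0
```

Because subgradient steps are not guaranteed to decrease `f`, the method tracks the best iterate seen; the manifold version replaces the straight-line step with a step along a geodesic.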
Convex Analysis and Monotone Operator Theory in Hilbert Spaces
This book provides a largely self-contained account of the main results of convex analysis and optimization in Hilbert space. A concise exposition of related constructive fixed point theory is…