Corpus ID: 218487233

Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold

@article{Wang2020RiemannianSP,
  title={Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold},
  author={Bokun Wang and Shiqian Ma and Lingzhou Xue},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.01209}
}
Riemannian optimization has drawn a lot of attention due to its wide applications in practice. Riemannian stochastic first-order algorithms have been studied in the literature to solve large-scale machine learning problems over Riemannian manifolds. However, most of the existing Riemannian stochastic algorithms require the objective function to be differentiable, and they do not apply to the case where the objective function is nonsmooth. In this paper, we present two Riemannian stochastic…
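To illustrate the setting the abstract describes, the following is a minimal sketch of a Riemannian gradient step on the Stiefel manifold St(n, r) = {X : XᵀX = I}, using the tangent-space projection and a QR-based retraction. This handles only a smooth objective (here a toy trace function, not from the paper); the stochastic proximal treatment of the nonsmooth term is the paper's contribution and is omitted.

```python
import numpy as np

def proj_tangent(X, G):
    # Project G onto the tangent space of St(n, r) at X:
    # P_X(G) = G - X sym(X^T G), where sym(A) = (A + A^T) / 2
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2)

def retract_qr(X, V):
    # QR-based retraction: map the tangent step X + V back onto the manifold
    Q, R = np.linalg.qr(X + V)
    # Fix column signs so the retraction is continuous (diag(R) > 0)
    return Q * np.sign(np.diag(R))

# Toy smooth objective f(X) = -0.5 * tr(X^T A X) for a symmetric A
rng = np.random.default_rng(0)
n, r = 8, 2
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

X, _ = np.linalg.qr(rng.standard_normal((n, r)))  # feasible start
step = 0.1
for _ in range(200):
    egrad = -A @ X                    # Euclidean gradient of f
    rgrad = proj_tangent(X, egrad)    # Riemannian gradient
    X = retract_qr(X, -step * rgrad)  # gradient step + retraction
```

Each iterate stays exactly on the manifold because the retraction returns an orthonormal factor, which is the feasibility guarantee that manifold methods trade against the unconstrained formulations the abstract contrasts them with.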
