Corpus ID: 52913094

McTorch, a manifold optimization library for deep learning

@article{Meghwanshi2018McTorchAM,
  title={McTorch, a manifold optimization library for deep learning},
  author={Mayank Meghwanshi and Pratik Jawanpuria and Anoop Kunchukuttan and Hiroyuki Kasai and Bamdev Mishra},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.01811}
}
In this paper, we introduce McTorch, a manifold optimization library for deep learning that extends PyTorch. It aims to lower the barrier for users wishing to use manifold constraints in deep learning applications, i.e., when the parameters are constrained to lie on a manifold. Such constraints include the popular orthogonality and rank constraints, and have recently been used in a number of applications in deep learning. McTorch follows PyTorch's architecture and decouples manifold definitions…
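As context for what a manifold constraint means in practice, here is a minimal plain-NumPy sketch (not McTorch's actual API; all names and the step size are illustrative assumptions) of an orthogonality-constrained update: take a Euclidean gradient step, then map the iterate back onto the Stiefel manifold St(n, p) = {X : XᵀX = I} with a QR-based retraction.

```python
import numpy as np

def qr_retraction(X):
    """Map a full-rank n x p matrix onto the Stiefel manifold via QR."""
    Q, R = np.linalg.qr(X)
    # Fix the sign ambiguity of the QR factorization so the map is unique.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

rng = np.random.default_rng(0)
n, p, lr = 6, 3, 0.1
X = qr_retraction(rng.standard_normal((n, p)))   # start on the manifold
A = rng.standard_normal((n, n))
A = A + A.T                                      # symmetric cost matrix

for _ in range(100):
    egrad = 2 * A @ X                            # Euclidean gradient of tr(X^T A X)
    X = qr_retraction(X - lr * egrad)            # step, then retract

print(np.allclose(X.T @ X, np.eye(p)))           # True: columns stay orthonormal
```

Whatever the cost function does, the retraction guarantees every iterate satisfies the orthogonality constraint exactly, which is the core idea the library automates.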
Manifold optimization for optimal transport
This work discusses optimization-related ingredients that allow modeling the OT problem on smooth Riemannian manifolds by exploiting the geometry of the search space, and makes available the Manifold optimization-based Optimal Transport repository with code for solving OT problems in Python and MATLAB.

A Dual Framework for Low-rank Tensor Completion
This work proposes a variant of the latent trace norm that helps in learning a non-sparse combination of tensors, and develops a dual framework for solving the low-rank tensor completion problem.

Sliced Gromov-Wasserstein
A novel OT discrepancy is defined that can deal with large-scale distributions via a slicing approach and is demonstrated to tackle similar problems as GW while being several orders of magnitude faster to compute.

The Sparse Recovery Autoencoder
A new method to learn linear encoders that adapt to data, while still performing well with the widely used $\ell_1$ decoder, is presented, based on the insight that unfolding the convex decoder into projected gradient steps can address this issue.

Geometry-aware domain adaptation for unsupervised alignment of word embeddings
A novel manifold-based geometric approach for learning unsupervised alignment of word embeddings between the source and the target languages, formulating the alignment learning problem as a domain adaptation problem over the manifold of doubly stochastic matrices.

Geoopt: Riemannian Optimization in PyTorch
Geoopt is a research-oriented, modular, open-source package for Riemannian optimization in PyTorch. The core of Geoopt is a standard Manifold interface that allows for the generic implementation of…

Introduction to Geometric Learning in Python with Geomstats
The open-source Python package geomstats is presented, and hands-on tutorials for differential geometry and geometric machine learning algorithms (Geometric Learning) that rely on it are introduced.

Adaptive stochastic gradient algorithms on Riemannian manifolds
Novel stochastic gradient algorithms are proposed for problems on Riemannian manifolds by adapting the row and column subspaces of gradients; they achieve a convergence rate of order $O(\log(T)/\sqrt{T})$, where $T$ is the number of iterations.

Riemannian adaptive stochastic gradient algorithms on matrix manifolds
This work proposes novel stochastic gradient algorithms for problems on Riemannian matrix manifolds by adapting the row and column subspaces of gradients, and achieves a convergence rate of order $\mathcal{O}(\log(T)/\sqrt{T})$, where $T$ is the number of iterations.

Learning a Compressed Sensing Measurement Matrix via Gradient Unrolling
A new method to learn linear encoders that adapt to data, while still performing well with the widely used $\ell_1$ decoder, is presented, based on the insight that unrolling the convex decoder into $T$ projected subgradient steps can address this issue.

References

Showing 1-10 of 35 references
Geometry Aware Constrained Optimization Techniques for Deep Learning
This paper generalizes the Stochastic Gradient Descent (SGD) and RMSProp algorithms to the setting of Riemannian optimization, and substantiates the proposed extensions with a range of relevant problems in machine learning, such as incremental Principal Component Analysis, computing the Riemannian centroids of SPD matrices, and Deep Metric Learning.
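The Riemannian SGD idea summarized above can be sketched with plain NumPy on the simplest matrix manifold, the unit sphere, where it recovers a leading eigenvector (the ingredient behind incremental PCA). This is a hedged illustration under assumed names and hyperparameters, not the paper's implementation: project the Euclidean gradient onto the tangent space, step, then retract by renormalizing.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
A = A @ A.T                          # SPD cost matrix; maximize x^T A x on the sphere
x = rng.standard_normal(8)
x /= np.linalg.norm(x)               # initial point on the unit sphere

for _ in range(1000):
    egrad = -2 * A @ x               # Euclidean gradient of f(x) = -x^T A x
    rgrad = egrad - (x @ egrad) * x  # Riemannian gradient: project onto tangent space
    x = x - 0.02 * rgrad             # descent step along the tangent direction
    x /= np.linalg.norm(x)           # retraction: renormalize back onto the sphere

top = np.linalg.eigh(A)[1][:, -1]    # top eigenvector as a reference
print(abs(x @ top))                  # near 1: x aligned with the top eigenvector
```

The tangent-space projection and the retraction are the two manifold-specific ingredients; the Riemannian versions of SGD and RMSProp differ only in how the tangent step direction is computed.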
A Unified Framework for Structured Low-rank Matrix Learning
We propose a novel optimization framework for learning a low-rank matrix which is also constrained to lie in a linear subspace. Exploiting the duality theory, we present a factorization that…

Building Deep Networks on Grassmann Manifolds
This paper proposes a deep network architecture by generalizing the Euclidean network paradigm to Grassmann manifolds; it designs full-rank mapping layers to transform input Grassmannian data into more desirable forms, and exploits re-orthonormalization layers to normalize the resulting matrices.

A Dual Framework for Low-rank Tensor Completion
This work proposes a variant of the latent trace norm that helps in learning a non-sparse combination of tensors, and develops a dual framework for solving the low-rank tensor completion problem.

Matrix Manifold Optimization for Gaussian Mixtures
This work advances Riemannian manifold optimization (on the manifold of positive definite matrices) as a potential replacement for Expectation Maximization (EM) and develops a well-tuned Riemannian LBFGS method that proves superior to known competing methods (e.g., Riemannian conjugate gradient).

Automatic differentiation in PyTorch
An automatic differentiation module of PyTorch is described: a library designed to enable rapid research on machine learning models that performs differentiation of purely imperative programs, with a focus on extensibility and low overhead.

Symmetry-invariant optimization in deep networks
This work shows that commonly used deep networks, such as those which use a max-pooling and sub-sampling layer, possess more complex forms of symmetry arising from scaling-based reparameterization of the network weights.

Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
Pymanopt is introduced, a toolbox for optimization on manifolds, implemented in Python, that, similarly to the Manopt MATLAB toolbox, implements several manifold geometries and optimization algorithms.

Manopt, a matlab toolbox for optimization on manifolds
The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms, aiming particularly at lowering the entry barrier.

MADMM: A Generic Algorithm for Non-smooth Optimization on Manifolds
This paper proposes the Manifold Alternating Directions Method of Multipliers (MADMM), an extension of the classical ADMM scheme for manifold-constrained non-smooth optimization problems, which is the first generic non-smooth manifold optimization method.