Corpus ID: 44218327

A Unified Framework for Structured Low-rank Matrix Learning

@inproceedings{Jawanpuria2018AUF,
  title={A Unified Framework for Structured Low-rank Matrix Learning},
  author={Pratik Jawanpuria and Bamdev Mishra},
  booktitle={ICML},
  year={2018}
}
We propose a novel optimization framework for learning a low-rank matrix which is also constrained to lie in a linear subspace. Exploiting duality theory, we present a factorization that decouples the low-rank and structural constraints onto separate factors. The optimization problem is formulated on the Riemannian spectrahedron manifold, where the Riemannian framework allows us to develop computationally efficient conjugate gradient and trust-region algorithms. Our approach easily…
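The core construction, searching over the spectrahedron {X PSD : tr(X) = 1} of bounded rank, can be illustrated with a small self-contained sketch. The NumPy code below is not the authors' implementation: it parameterizes a rank-r spectrahedron point as X = YY^T with ||Y||_F = 1 (so Y lives on a unit Frobenius sphere) and runs plain Riemannian gradient descent on a toy symmetric matrix-completion cost. The function name, step size, and iteration count are illustrative assumptions; the paper's actual algorithms are Riemannian conjugate gradient and trust-region methods built on the dual factorization.

    import numpy as np

    def spectrahedron_completion(M_obs, mask, r, steps=3000, lr=0.2, seed=0):
        # Toy Riemannian gradient descent on the rank-r spectrahedron
        # {X PSD, tr(X) = 1}, via X = Y @ Y.T with ||Y||_F = 1.
        # Illustrative sketch only -- not the paper's algorithm.
        rng = np.random.default_rng(seed)
        n = M_obs.shape[0]
        Y = rng.standard_normal((n, r))
        Y /= np.linalg.norm(Y)                 # start on the unit Frobenius sphere
        for _ in range(steps):
            R = mask * (Y @ Y.T - M_obs)       # residual on observed entries
            G = 2.0 * (R + R.T) @ Y            # Euclidean gradient of ||R||_F^2 w.r.t. Y
            G -= np.sum(G * Y) * Y             # project onto the sphere's tangent space
            Y -= lr * G                        # fixed step for simplicity; a line
            Y /= np.linalg.norm(Y)             # search is more robust. Then retract.
        return Y @ Y.T

    # Usage: recover a random rank-2, unit-trace PSD matrix from ~60% of its entries.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 2))
    M_true = (A @ A.T) / np.trace(A @ A.T)     # ground truth on the spectrahedron
    upper = rng.random((20, 20)) < 0.6
    mask = np.triu(upper) | np.triu(upper).T   # symmetric observation pattern
    X_hat = spectrahedron_completion(mask * M_true, mask, r=2)
    print("relative error:", np.linalg.norm(X_hat - M_true) / np.linalg.norm(M_true))

The point of the sphere parameterization is that the trace constraint is enforced for free, which is what makes the spectrahedron amenable to off-the-shelf Riemannian optimizers; the paper layers the structural (linear subspace) constraint on top via its dual factorization.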
Structured low-rank matrix learning: algorithms and applications
TLDR
This work considers the problem of learning a low-rank matrix constrained to lie in a linear subspace, introduces a novel factorization for modeling such matrices, and formulates the optimization problem on the Riemannian spectrahedron manifold.
A Dual Framework for Low-rank Tensor Completion
TLDR
This work proposes a variant of the latent trace norm that helps in learning a non-sparse combination of tensors, and develops a dual framework for solving the low-rank tensor completion problem.
Beyond Alternating Updates for Matrix Factorization with Inertial Bregman Proximal Gradient Algorithms
TLDR
For non-alternating schemes, such as the recently introduced Bregman Proximal Gradient (BPG) method and its inertial variant, Convex-Concave Inertial BPG, convergence of the whole sequence to a stationary point is proved for matrix factorization.
Riemannian adaptive stochastic gradient algorithms on matrix manifolds
TLDR
This work proposes novel stochastic gradient algorithms for problems on Riemannian matrix manifolds by adapting the row and column subspaces of gradients and achieves the convergence rate of order $\mathcal{O}(\log (T)/\sqrt{T})$, where $T$ is the number of iterations.
McTorch, a manifold optimization library for deep learning
TLDR
McTorch is introduced: a manifold optimization library for deep learning that extends PyTorch and decouples manifold definitions from optimizers, i.e., once a new manifold is added it can be used with any existing optimizer, and vice versa.
Protein structure determination using Riemannian approach
TLDR
By combining the Riemannian approach with post-processing procedures, protein structures are reconstructed from the incomplete distance information measured by NMR; relative to “standard” X-ray structures, the reconstructions have similar (sometimes even smaller) RMSDs than the reference NMR structures.
Personalized Compatibility Metric Learning
Recommending sets of items that include both personalized and compatible items is crucial to personalized styling programs such as Amazon’s Personal Shopper. There is both an extensive literature on…
Scaling Up Collaborative Filtering Data Sets through Randomized Fractal Expansions
TLDR
The proposed method adapts Kronecker Graph Theory to preserve key higher order statistical properties such as the fat-tailed distribution of user engagements, item popularity, and singular value spectra of user/item interaction matrices.
Scalable Realistic Recommendation Datasets through Fractal Expansions
TLDR
This work adapts Kronecker Graph Theory to user/item incidence matrices and shows that the corresponding fractal expansions preserve the fat-tailed distributions of user engagements, item popularity, and singular value spectra of user/item interaction matrices.
Protein structure determination using a Riemannian approach
TLDR
A novel method based on a matrix completion technique, the Riemannian approach, is adopted to rebuild the protein structure from the nuclear Overhauser effect distance restraints and the dihedral angle restraints.

References

Showing 1-10 of 71 references
Structured low-rank matrix learning: algorithms and applications
TLDR
This work considers the problem of learning a low-rank matrix constrained to lie in a linear subspace, introduces a novel factorization for modeling such matrices, and formulates the optimization problem on the Riemannian spectrahedron manifold.
A Saddle Point Approach to Structured Low-rank Matrix Learning in Large-scale Applications
TLDR
The numerical comparisons show that the proposed algorithms outperform state-of-the-art algorithms in standard and robust matrix completion, stochastic realization, and multi-task feature learning problems on various benchmarks.
Fixed-rank matrix factorizations and Riemannian low-rank optimization
TLDR
Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix.
Robust Low-Rank Matrix Completion by Riemannian Optimization
TLDR
This paper proposes RMC, a new method to deal with the problem of robust low-rank matrix completion, i.e., matrix completion where a fraction of the observed entries are corrupted by non-Gaussian noise, typically outliers.
Low-Rank Optimization with Trace Norm Penalty
TLDR
The paper addresses the problem of low-rank trace norm minimization with an algorithm that alternates between fixed-rank optimization and rank-one updates and presents a second-order trust-region algorithm with a guaranteed quadratic rate of convergence.
Efficient Structured Matrix Rank Minimization
TLDR
Numerical results show that this approach significantly outperforms state-of-the-art competitors in terms of running time, while effectively recovering low-rank solutions in stochastic system realization and spectral compressed sensing problems.
R3MC: A Riemannian three-factor algorithm for low-rank matrix completion
TLDR
Numerical comparisons suggest that R3MC robustly outperforms state-of-the-art algorithms across different problem instances, especially those that combine scarcely sampled and ill-conditioned data.
Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
We address the numerical problem of recovering large matrices of low rank when most of the entries are unknown. We exploit the geometry of the low-rank constraint to recast the problem as an…
Proximal Riemannian Pursuit for Large-Scale Trace-Norm Minimization
TLDR
A proximal Riemannian pursuit paradigm is proposed which addresses a sequence of trace-norm regularized subproblems defined on nonlinear matrix varieties, in which the SVDs of intermediate solutions are maintained by cheap low-rank QR decompositions, therefore making the proposed method more scalable.
Low-Rank Matrix Completion by Riemannian Optimization
TLDR
This work proposes a new algorithm for matrix completion that minimizes the least-square distance on the sampling set over the Riemannian manifold of fixed-rank matrices and proves convergence of a regularized version of the algorithm under the assumption that the restricted isometry property holds for incoherent matrices throughout the iterations.