Corpus ID: 211011308

Gen-Oja: A Two-time-scale approach for Streaming CCA

@article{Bhatia2018GenOjaAT,
  title={Gen-Oja: A Two-time-scale approach for Streaming CCA},
  author={Kush Bhatia and Aldo Pacchiano and Nicolas Flammarion and Peter L. Bartlett and Michael I. Jordan},
  journal={arXiv: Learning},
  year={2018}
}
In this paper, we study the problems of principal Generalized Eigenvector computation and Canonical Correlation Analysis in the stochastic setting. We propose a simple and efficient algorithm, Gen-Oja, for these problems. We prove the global convergence of our algorithm, borrowing ideas from the theory of fast-mixing Markov chains and two-time-scale stochastic approximation, showing that it achieves the optimal rate of convergence. In the process, we develop tools for understanding stochastic… 
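The two-time-scale idea described in the abstract can be sketched roughly as follows. This is a minimal illustrative implementation under stated assumptions, not the paper's exact algorithm: the step-size schedule, the precise coupling of the two updates, and all function and variable names (`gen_oja`, `alpha`, `beta0`) are assumptions. A fast iterate `w` chases the solution of B w = v by SGD, while a slow iterate `v` performs an Oja-style update using the stream of A samples; `w` then approximates the principal generalized eigenvector of the pair (A, B).

```python
import numpy as np
from itertools import repeat

def gen_oja(sample_stream, d, n_iters, alpha=0.1, beta0=1.0):
    """Two-time-scale sketch for the principal generalized eigenvector
    of (A, B), given a stream of unbiased samples (A_t, B_t).

    Fast time-scale: w tracks the solution of B w = v via SGD.
    Slow time-scale: v takes an Oja-style step with A_t w, then
    re-normalizes. Step sizes here are illustrative choices only.
    """
    rng = np.random.default_rng(0)
    w = rng.standard_normal(d)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for t, (A_t, B_t) in zip(range(1, n_iters + 1), sample_stream):
        # Fast update: one SGD step on 0.5 * w^T B w - v^T w
        w = w - alpha * (B_t @ w - v)
        # Slow update: Oja-style step with a decaying step size
        v = v + (beta0 / t) * (A_t @ w)
        v /= np.linalg.norm(v)
    return w / np.linalg.norm(w)

# Noiseless sanity check: B^{-1} A = diag(5, 1, 0.5), so the
# principal generalized eigenvector is e_1 (up to sign).
A = np.diag([10.0, 1.0, 0.5])
B = np.diag([2.0, 1.0, 1.0])
w_hat = gen_oja(repeat((A, B)), d=3, n_iters=2000, alpha=0.3)
```

In the noiseless check the "stream" simply repeats the exact matrices, so the iterate should align with the first coordinate axis.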
1 Citation
Incremental Canonical Correlation Analysis
TLDR: An incremental canonical correlation analysis is proposed that adaptively maintains constant-size storage for both the mean and covariance matrices and reduces overhead by using sequential singular value decomposition (SVD); it remains efficient even when the number of samples is small.

References

Showing 1-10 of 24 references
Stochastic Approximation for Canonical Correlation Analysis
TLDR: Novel first-order stochastic approximation algorithms for canonical correlation analysis (CCA) are proposed that achieve $\epsilon$-suboptimality in the population objective in $\operatorname{poly}(\frac{1}{\epsilon})$ iterations.
Efficient Globally Convergent Stochastic Optimization for Canonical Correlation Analysis
TLDR: This work proposes two globally convergent meta-algorithms for CCA, both of which transform the original problem into sequences of least-squares problems that need only be solved approximately, and obtains time complexities that significantly improve upon those of previous work.
Efficient Algorithms for Large-scale Generalized Eigenvector Computation and Canonical Correlation Analysis
TLDR: This paper considers the problem of canonical correlation analysis (CCA) and, more broadly, the generalized eigenvector problem for a pair of symmetric matrices, and provides simple iterative algorithms for solving these problems that are globally linearly convergent with moderate dependence on the condition numbers and eigenvalue gaps of the matrices involved.
The Noisy Power Method: A Meta Algorithm with Applications
TLDR: Provides a new robust convergence analysis of the well-known power method for computing the dominant singular vectors of a matrix, here called the noisy power method, and shows that the algorithm's error dependence on the matrix dimension can be replaced by an essentially tight dependence on the coherence of the matrix.
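As a rough illustration of the idea summarized above, here is a minimal power iteration in which every matrix-vector product is perturbed by additive noise. The function name `noisy_power_method` and the noise model are illustrative assumptions; the paper's tolerance condition relating the noise to the eigengap is not checked here.

```python
import numpy as np

def noisy_power_method(matvec, d, n_iters, rng=None):
    """Power iteration where each matrix-vector product may be
    perturbed: matvec(x) returns M @ x plus noise. With per-step
    noise small relative to the eigengap, the iterate still
    converges near the dominant eigenvector.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(n_iters):
        x = matvec(x)          # possibly noisy product
        x /= np.linalg.norm(x)
    return x

# Demo: dominant eigenvector of diag(3, 1, 0.5) is e_1; each
# product is corrupted by small Gaussian noise.
M = np.diag([3.0, 1.0, 0.5])
noise_rng = np.random.default_rng(1)
def mv(x):
    return M @ x + 1e-3 * noise_rng.standard_normal(3)

x_hat = noisy_power_method(mv, d=3, n_iters=50)
```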
Bridging the gap between constant step size stochastic gradient descent and Markov chains
TLDR: It is shown that Richardson-Romberg extrapolation may be used to get closer to the global optimum, and an explicit asymptotic expansion of the moments of the averaged SGD iterates is given that outlines the dependence on initial conditions, the effect of noise and the step size, as well as the lack of convergence in the general case.
Large Scale Canonical Correlation Analysis with Iterative Least Squares
Canonical Correlation Analysis (CCA) is a widely used statistical tool with both well-established theory and favorable performance for a wide range of machine learning problems. However, computing…
Convergence of Stochastic Gradient Descent for PCA
TLDR: This paper provides the first eigengap-free convergence guarantees for SGD in the context of PCA in a streaming stochastic setting, and shows that the same techniques lead to new SGD convergence guarantees with better dependence on the eigengap.
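The streaming-PCA setting above can be illustrated with Oja's rule, the classic SGD update for the top principal component. The step-size schedule and the function name `oja_pca` below are illustrative assumptions, not necessarily the scheme analyzed in the paper.

```python
import numpy as np

def oja_pca(samples, d, eta0=0.5):
    """Streaming PCA via Oja's rule: for each sample x_t, take a
    stochastic gradient step with the rank-one estimate x_t x_t^T
    and re-normalize. The eta0/t schedule is one common choice.
    """
    rng = np.random.default_rng(0)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for t, x in enumerate(samples, start=1):
        w = w + (eta0 / t) * x * (x @ w)   # w += eta_t * (x x^T) w
        w /= np.linalg.norm(w)
    return w

# Demo: samples with covariance diag(4, 1, 0.25); the top
# principal component is e_1 (up to sign).
data_rng = np.random.default_rng(2)
scales = np.sqrt(np.array([4.0, 1.0, 0.25]))
stream = (scales * data_rng.standard_normal(3) for _ in range(5000))
w_hat = oja_pca(stream, d=3)
```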
Non-strongly-convex smooth stochastic approximation with convergence rate O(1/n)
We consider the stochastic approximation problem where a convex function has to be minimized, given only the knowledge of unbiased estimates of its gradients at certain points, a framework which
Sparse CCA: Adaptive Estimation and Computational Barriers
TLDR: It is shown that a sample size condition is needed for any randomized polynomial-time estimator to be consistent, assuming hardness of certain instances of the Planted Clique detection problem.
Stochastic Canonical Correlation Analysis
TLDR: The sample complexity of canonical correlation analysis is studied, showing that, given an estimate of the canonical correlation, the streaming version of the shift-and-invert power iterations achieves the same learning accuracy with the same level of sample complexity, by processing the data only once.
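The shift-and-invert power iterations mentioned above can be sketched in the batch (non-streaming) case as follows. The shift value and the exact linear solve are illustrative simplifications: the streaming variant replaces the exact solve with an approximate stochastic least-squares solve, and the function name `shift_invert_power` is an assumption.

```python
import numpy as np

def shift_invert_power(M, sigma, n_iters):
    """Power iteration on (sigma*I - M)^{-1}: with a shift sigma
    just above the top eigenvalue, the spectral gap of the inverted
    operator is amplified, so far fewer iterations are needed than
    with plain power iteration on M.
    """
    d = M.shape[0]
    shifted = sigma * np.eye(d) - M
    x = np.random.default_rng(0).standard_normal(d)
    x /= np.linalg.norm(x)
    for _ in range(n_iters):
        x = np.linalg.solve(shifted, x)   # apply (sigma*I - M)^{-1}
        x /= np.linalg.norm(x)
    return x

# Demo: eigenvalues 2 and 1.9 are nearly degenerate for plain power
# iteration, but the shifted-inverse spectrum (20, 6.67, 0.65) has a
# large gap, so 10 iterations suffice to recover e_1 (up to sign).
M = np.diag([2.0, 1.9, 0.5])
x_hat = shift_invert_power(M, sigma=2.05, n_iters=10)
```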
...