Efficient Globally Convergent Stochastic Optimization for Canonical Correlation Analysis

@inproceedings{Wang2016EfficientGC,
  title={Efficient Globally Convergent Stochastic Optimization for Canonical Correlation Analysis},
  author={Weiran Wang and Jialei Wang and Dan Garber and Nathan Srebro},
  booktitle={NIPS},
  year={2016}
}
We study the stochastic optimization of canonical correlation analysis (CCA), whose objective is nonconvex and does not decouple over training samples. Although several stochastic-gradient-based optimization algorithms have recently been proposed for this problem, none of them comes with a global convergence guarantee. Inspired by the alternating least squares/power iterations formulation of CCA, and the shift-and-invert preconditioning method for PCA, we propose two globally…
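The alternating least squares/power iterations formulation mentioned in the abstract can be illustrated with a short sketch. The code below is not the paper's algorithm (which solves each least-squares subproblem approximately with stochastic gradient steps); it is a minimal batch version on synthetic data, assuming a small ridge term `reg` for numerical stability (a hypothetical choice, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)
n, dx, dy = 1000, 5, 4

# Synthetic two-view data sharing a one-dimensional latent signal z.
z = rng.normal(size=(n, 1))
X = z @ rng.normal(size=(1, dx)) + 0.5 * rng.normal(size=(n, dx))
Y = z @ rng.normal(size=(1, dy)) + 0.5 * rng.normal(size=(n, dy))
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

reg = 1e-3  # small ridge regularizer (hypothetical, for stability)
Cxx = X.T @ X / n + reg * np.eye(dx)
Cyy = Y.T @ Y / n + reg * np.eye(dy)
Cxy = X.T @ Y / n

u = rng.normal(size=dx)
v = rng.normal(size=dy)
for _ in range(100):
    # Each half-step is a least-squares solve; the paper's stochastic
    # meta-algorithms replace these exact solves with approximate ones.
    u = np.linalg.solve(Cxx, Cxy @ v)
    u /= np.sqrt(u @ Cxx @ u)   # enforce u' Cxx u = 1
    v = np.linalg.solve(Cyy, Cxy.T @ u)
    v /= np.sqrt(v @ Cyy @ v)   # enforce v' Cyy v = 1

corr = u @ Cxy @ v  # estimate of the top canonical correlation
```

Each iteration is one step of power iteration on the whitened cross-covariance operator, so `(u, v)` converges to the top canonical pair whenever the leading singular value is separated from the rest.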
