Corpus ID: 49397352

Deep Orthogonal Representations: Fundamental Properties and Applications

@article{Hsu2018DeepOR,
  title={Deep Orthogonal Representations: Fundamental Properties and Applications},
  author={Hsiang Hsu and Salman Salamatian and F. Calmon},
  journal={arXiv preprint arXiv:1806.08449},
  year={2018}
}
Several representation learning and, more broadly, dimensionality reduction techniques seek to produce representations of the data that are orthogonal (uncorrelated). Examples include PCA, CCA, Kernel/Deep CCA, the ACE algorithm, and correspondence analysis (CA). For a fixed data distribution, all finite-variance representations belong to the same function space regardless of how they are derived. In this work, we present a theoretical framework for analyzing this function space, and demonstrate…
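As a minimal illustration of the "orthogonal (uncorrelated) representations" the abstract refers to, the sketch below uses PCA, the simplest technique in the list. This is not code from the paper; the data and dimensions are arbitrary. It projects centered data onto the top principal directions and checks that the resulting coordinates are uncorrelated (diagonal covariance).

```python
import numpy as np

# Illustrative only: PCA yields orthogonal (uncorrelated) coordinates.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))  # correlated features
Xc = X - X.mean(axis=0)                                  # center the data

# Principal directions come from the SVD of the centered data matrix.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                                        # top-2 PCA representation

# The covariance of the representation is (numerically) diagonal.
C = np.cov(Z, rowvar=False)
print(abs(C[0, 1]))  # off-diagonal entry is ~0 up to floating point
```

Deep CCA and the other methods named in the abstract target the same property (uncorrelated coordinates), but learn the mapping with neural networks rather than a linear projection.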