Dimensionality reduction by unsupervised regression

Abstract

We consider the problem of dimensionality reduction: given high-dimensional data, we want to estimate two mappings, from high to low dimension (dimensionality reduction) and from low to high dimension (reconstruction). We adopt an unsupervised regression point of view by introducing the unknown low-dimensional coordinates of the data as parameters, and formulate a regularised objective functional of the mappings and the low-dimensional coordinates. Alternating minimisation of this functional is straightforward: for fixed low-dimensional coordinates, the mappings have a unique solution; and for fixed mappings, the coordinates can be obtained by finite-dimensional non-linear minimisation. Moreover, the coordinates can be initialised to the output of a spectral method such as Laplacian eigenmaps. The model generalises PCA and several recent methods that learn one of the two mappings but not both; and, unlike spectral methods, it provides out-of-sample mappings by construction. Experiments with toy and real-world problems show that the model learns mappings for convoluted manifolds, avoiding the bad local optima that plague other methods.
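The alternating scheme described above can be made concrete with linear mappings f(x) = Ax and F(y) = By, in which case both half-steps have closed forms and the model reduces to a PCA-like factorisation. The Python sketch below is illustrative only: the function name fit_dur, the ridge regulariser lam, and the PCA initialisation are our assumptions, not the paper's. With non-linear mappings (e.g. RBF networks), the X-update becomes the finite-dimensional non-linear minimisation the abstract mentions, and the paper initialises X with a spectral method such as Laplacian eigenmaps rather than PCA.

# Minimal sketch of alternating minimisation for unsupervised regression,
# assuming linear mappings so every step is closed-form. Hypothetical names.
import numpy as np

def fit_dur(Y, d, lam=1e-2, n_iters=50):
    """Y: (D, N) data matrix; d: latent dimension; lam: ridge regulariser."""
    D, N = Y.shape
    # Initialise latent coordinates. The paper uses a spectral method
    # (Laplacian eigenmaps); PCA is used here to keep the sketch self-contained.
    Yc = Y - Y.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(Yc, full_matrices=False)
    X = U[:, :d].T @ Yc                                   # (d, N) coordinates

    I_d, I_D = np.eye(d), np.eye(D)
    for _ in range(n_iters):
        # Step 1: for fixed X, both mappings are ridge regressions with
        # unique solutions (reconstruction f: A, dimensionality reduction F: B).
        A = Y @ X.T @ np.linalg.inv(X @ X.T + lam * I_d)
        B = X @ Y.T @ np.linalg.inv(Y @ Y.T + lam * I_D)
        # Step 2: for fixed mappings, minimise
        # ||Y - A X||^2 + ||X - B Y||^2 over X. Quadratic in X when the
        # mappings are linear; non-linear mappings would need an iterative
        # optimiser here instead.
        X = np.linalg.solve(A.T @ A + I_d, A.T @ Y + B @ Y)
    return A, B, X

# Usage: noisy points near a 1-D subspace of R^3.
rng = np.random.default_rng(0)
Y = np.outer(rng.standard_normal(3), rng.standard_normal(200)) \
    + 0.05 * rng.standard_normal((3, 200))
A, B, X = fit_dur(Y, d=1)
print("relative reconstruction error:", np.linalg.norm(Y - A @ X) / np.linalg.norm(Y))

Each pass decreases the regularised objective, since step 1 minimises it over the mappings and step 2 over the coordinates; by construction, B supplies the out-of-sample mapping for new data points.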

DOI: 10.1109/CVPR.2008.4587666

Cite this paper

@article{CarreiraPerpin2008DimensionalityRB,
  title   = {Dimensionality reduction by unsupervised regression},
  author  = {Miguel {\'A}. Carreira-Perpi{\~n}{\'a}n and Zhengdong Lu},
  journal = {2008 IEEE Conference on Computer Vision and Pattern Recognition},
  year    = {2008},
  pages   = {1-8}
}