Dimensionality reduction based on non-parametric mutual information

Lev Faivishevsky and Jacob Goldberger
In this paper we introduce a supervised linear dimensionality reduction algorithm that finds a projection of the input space maximizing the mutual information between the projected inputs and the output values. The algorithm utilizes the recently introduced MeanNN estimator for differential entropy. We show that this estimator is well suited to the dimensionality reduction task. We then present a nonlinear regression algorithm based on the proposed dimensionality reduction approach. The regression algorithm…
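The MeanNN estimator mentioned in the abstract averages nearest-neighbor entropy estimates over all neighborhood sizes, which reduces to an average of log pairwise distances. A minimal sketch of that idea is below; the function name is my own, and the additive constants of the full estimator are dropped, so this is only a rough illustration, not the authors' implementation.

```python
import numpy as np

def meannn_entropy(X):
    """Rough MeanNN-style differential entropy estimate, up to an
    additive constant: averaging the k-NN entropy estimator over all
    k collapses to the mean of log pairwise distances, scaled by the
    dimension d. X is an (N, d) array of samples."""
    N, d = X.shape
    # All pairwise Euclidean distances (upper triangle, i < j).
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(N, k=1)
    return d * np.mean(np.log(dist[iu]))
```

Because the estimate is a smooth function of the sample coordinates, it can be differentiated with respect to a projection matrix, which is what makes it usable inside a gradient-based dimensionality reduction objective.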


