Benefitting from the Variables that Variable Selection Discards

Rich Caruana, Virginia R. de Sa
Journal of Machine Learning Research
In supervised learning, variable selection is used to find a subset of the available inputs that accurately predicts the output. This paper shows that some of the variables that variable selection discards can beneficially be used as extra outputs for inductive transfer. Using discarded input variables as extra outputs forces the model to learn mappings from the selected inputs to these extra outputs. Inductive transfer makes what is learned by these mappings available to…
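The idea the abstract describes can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's exact setup): a single-hidden-layer network whose shared hidden layer feeds two heads, one predicting the target and one predicting the discarded variables, trained on a joint loss. All names, the synthetic data, and the loss weighting `lam` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 4 "selected" inputs X, 2 "discarded" variables Z, 1 target y.
n = 200
X = rng.normal(size=(n, 4))
Z = X @ rng.normal(size=(4, 2)) + 0.1 * rng.normal(size=(n, 2))
y = (X.sum(axis=1) + Z.sum(axis=1)).reshape(-1, 1)

# Shared hidden layer; separate heads for the target and the extra outputs.
W1 = rng.normal(scale=0.1, size=(4, 8))
Wy = rng.normal(scale=0.1, size=(8, 1))
Wz = rng.normal(scale=0.1, size=(8, 2))

lam = 0.5   # weight on the extra-output loss (illustrative choice)
lr = 0.01

def joint_loss():
    H = np.tanh(X @ W1)
    return (np.mean((H @ Wy - y) ** 2)
            + lam * np.mean((H @ Wz - Z) ** 2))

initial_loss = joint_loss()

for _ in range(500):
    H = np.tanh(X @ W1)
    # Gradients of the joint MSE loss; the shared layer receives signal
    # from both heads, which is where the inductive transfer happens.
    gy = 2 * (H @ Wy - y) / n
    gz = 2 * lam * (H @ Wz - Z) / n
    gH = gy @ Wy.T + gz @ Wz.T
    W1 -= lr * (X.T @ (gH * (1 - H ** 2)))
    Wy -= lr * (H.T @ gy)
    Wz -= lr * (H.T @ gz)

final_loss = joint_loss()
print(final_loss < initial_loss)
```

The extra head is discarded at test time; its only role is to shape the shared hidden representation during training.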


Publications citing this paper (showing 1-10 of 44 extracted citations):

Application of Random Forest Algorithm on Feature Subset Selection and Classification and Regression

2017 World Congress on Computing and Communication Technologies (WCCCT) • 2017

Robust Joint Graph Sparse Coding for Unsupervised Spectral Feature Selection

IEEE Transactions on Neural Networks and Learning Systems • 2017


Publications referenced by this paper (showing 1-10 of 18 references):

Multi-task SVMs. Presented at the Machines that Learn Workshop, Snowbird, 2002; available electronically at jebara/multitask.html

Tony Jebara

Multitask Learning

Rich Caruana
Machine Learning • 1997

Support-vector networks

C. Cortes and V. Vapnik
Machine Learning • 1995

A comparison of ID3 and backpropagation for English text-to-speech mapping

Joumana Ghosn, Yoshua Bengio
Machine Learning • 1995

Category learning through multimodality sensing

Morgan Kaufmann
