Sparse coding for multitask and transfer learning

Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes
We investigate the use of sparse coding and dictionary learning in the context of multitask and transfer learning. The central assumption of our learning method is that the task parameters are well approximated by sparse linear combinations of the atoms of a dictionary in a high- or infinite-dimensional space. This assumption, together with the large quantity of available data in the multitask and transfer learning settings, allows a principled choice of the dictionary. We provide bounds on the…
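The central assumption above can be sketched in code. The following is a minimal illustration, not the authors' algorithm: each task's parameter vector is modeled as a sparse linear combination of dictionary atoms, and the dictionary is fit by alternating ISTA sparse coding with a least-squares dictionary update. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm (elementwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(D, w, lam, n_iter=200):
    # ISTA: approximately solve min_c 0.5*||w - D c||^2 + lam*||c||_1.
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the smooth part
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ c - w)
        c = soft_threshold(c - grad / L, lam / L)
    return c

def learn_dictionary(W, n_atoms, lam, n_iter=30, seed=0):
    # W: (d, T) matrix whose columns are the T task parameter vectors.
    # Alternate between sparse coding each task vector and a
    # regularized least-squares dictionary update with unit-norm atoms.
    rng = np.random.default_rng(seed)
    d, T = W.shape
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        C = np.column_stack([sparse_code(D, W[:, t], lam) for t in range(T)])
        # Dictionary update: D = W C^T (C C^T + eps I)^{-1}, then renormalize atoms.
        D = W @ C.T @ np.linalg.inv(C @ C.T + 1e-6 * np.eye(n_atoms))
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    # Final coding pass so the returned codes match the returned dictionary.
    C = np.column_stack([sparse_code(D, W[:, t], lam) for t in range(T)])
    return D, C
```

On synthetic task vectors generated from a small ground-truth dictionary, this alternating scheme typically recovers a dictionary whose sparse combinations reconstruct the tasks closely; the paper's contribution is the generalization analysis of such a choice, not this particular optimization routine.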
This paper is highly cited, with 129 citations, and has been referenced on Twitter 5 times.


Citations

Semantic Scholar estimates that this publication has 130 citations (80 extracted) based on the available data.



Publications referenced by this paper (partial list of 21 references):

- Proximal Methods for Hierarchical Sparse Coding. Journal of Machine Learning Research, 2011.
- J. Baxter. A model for inductive bias learning. Journal of Artificial Intelligence Research, 2000.
- P. Bühlmann, S. van de Geer. Statistics for High-Dimensional Data: Methods, Theory and Applications.
- K-Dimensional Coding Schemes in Hilbert Spaces. IEEE Transactions on Information Theory, 2010.
- Transfer bounds for linear feature learning. Machine Learning, 2009.
- A. Argyriou, T. Evgeniou, M. Pontil. Convex multitask feature learning. Machine Learning, 2008.
