Transfering Nonlinear Representations using Gaussian Processes with a Shared Latent Space

@inproceedings{Urtasun2007TransferingNR,
  title={Transfering Nonlinear Representations using Gaussian Processes with a Shared Latent Space},
  author={Raquel Urtasun and Ariadna Quattoni and Trevor Darrell},
  year={2007}
}
When a series of problems are related, representations derived from learning earlier tasks may be useful in solving later problems. In this paper we propose a novel approach to transfer learning with low-dimensional, non-linear latent spaces. We show how such representations can be jointly learned across multiple tasks in a discriminative probabilistic regression framework. When transferred to new tasks with relatively few training examples, learning can be faster and/or more accurate…

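To make the transfer recipe concrete, below is a minimal sketch in Python (NumPy/SciPy) of the general idea, not the paper's exact model: a shared low-dimensional latent space is fit jointly to two related tasks by minimizing a GP-LVM-style negative log marginal likelihood, and a new task with only a handful of labelled examples is then predicted by GP regression from the transferred latent coordinates. The fixed kernel hyperparameters, the toy data, and helper names such as rbf_kernel and shared_latent_objective are illustrative assumptions; the paper's discriminative formulation and optimization details differ.

import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0, noise=1e-2):
    # Squared-exponential kernel matrix with a noise/jitter term on the diagonal.
    sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
    K = variance * np.exp(-0.5 * sq / lengthscale**2)
    return K + noise * np.eye(len(X))

def neg_log_marginal(K, Y):
    # GP-LVM-style objective: -log p(Y | X) up to constants, with D independent output dims.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))   # K^{-1} Y
    D = Y.shape[1]
    return 0.5 * np.sum(Y * alpha) + D * np.sum(np.log(np.diag(L)))

def shared_latent_objective(x_flat, Ys, N, Q):
    # Sum of per-task objectives, all sharing the same latent coordinates X,
    # plus a simple Gaussian prior on X.
    X = x_flat.reshape(N, Q)
    K = rbf_kernel(X)
    return sum(neg_log_marginal(K, Y) for Y in Ys) + 0.5 * np.sum(X**2)

# --- toy data: two related tasks observed on the same N examples (assumed setup) ---
rng = np.random.default_rng(0)
N, Q = 40, 2
true_X = rng.normal(size=(N, Q))
Y1 = np.sin(true_X @ rng.normal(size=(Q, 3))) + 0.05 * rng.normal(size=(N, 3))
Y2 = np.cos(true_X @ rng.normal(size=(Q, 2))) + 0.05 * rng.normal(size=(N, 2))

# Jointly learn a shared latent space from tasks 1 and 2 (numerical gradients, kept short).
res = minimize(shared_latent_objective, rng.normal(size=N * Q),
               args=([Y1, Y2], N, Q), method="L-BFGS-B", options={"maxiter": 100})
X_shared = res.x.reshape(N, Q)

# --- transfer: a hypothetical new task with only five labelled examples ---
idx = rng.choice(N, size=5, replace=False)
y_new = (true_X[:, 0] ** 2)[:, None]
K_full = rbf_kernel(X_shared)
K_small = K_full[np.ix_(idx, idx)]
K_cross = K_full[:, idx]
# GP regression from the transferred latent coordinates to the new task's labels.
pred = K_cross @ np.linalg.solve(K_small, y_new[idx])
print("held-out RMSE:", np.sqrt(np.mean((pred - y_new) ** 2)))

In the paper's setting the latent space is also shaped by the task outputs through the discriminative regression terms; in this sketch the Gaussian prior on X and the per-task likelihoods simply stand in for that joint objective.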
