Maximum margin transfer learning

@inproceedings{Su2009MaximumMT,
  title={Maximum margin transfer learning},
  author={Bai Su and Yi-Dong Shen},
  booktitle={GEC Summit},
  year={2009}
}
To achieve good generalization in supervised learning, the training and testing examples are usually required to be drawn from the same distribution. However, this identical-distribution assumption may be violated when a task arrives from a new domain (the target domain) while labeled data are available only from a similar old domain (the auxiliary domain). Labeling the new data can be costly, and discarding all the old data would be wasteful. In this paper, we present a…
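
Below is a minimal sketch of the problem setting the abstract describes, not the paper's algorithm: a maximum-margin classifier (a linear SVM) is trained only on labeled auxiliary-domain data and then evaluated on a target domain drawn from a shifted distribution, illustrating the generalization gap that motivates transfer learning. The synthetic data generation and the use of scikit-learn's LinearSVC are assumptions made purely for illustration.

```python
# Sketch of the auxiliary-vs-target setting (NOT the paper's method).
# Assumptions: synthetic Gaussian data, scikit-learn's LinearSVC as the
# maximum-margin classifier.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def make_domain(shift, n=200):
    """Two Gaussian classes; `shift` translates the whole domain,
    emulating the distribution mismatch between domains."""
    X_pos = rng.normal(loc=[1.0 + shift, 1.0], scale=0.8, size=(n // 2, 2))
    X_neg = rng.normal(loc=[-1.0 + shift, -1.0], scale=0.8, size=(n // 2, 2))
    X = np.vstack([X_pos, X_neg])
    y = np.array([1] * (n // 2) + [-1] * (n // 2))
    return X, y

# Labeled auxiliary-domain data; target data labels are used here only
# to measure the accuracy drop caused by the distribution shift.
X_aux, y_aux = make_domain(shift=0.0)
X_tgt, y_tgt = make_domain(shift=1.5)

clf = LinearSVC(C=1.0).fit(X_aux, y_aux)  # max-margin classifier on the old domain
print("auxiliary accuracy:", clf.score(X_aux, y_aux))
print("target accuracy:   ", clf.score(X_tgt, y_tgt))  # typically lower
```

The drop in target accuracy is the gap a transfer-learning method such as the one proposed in this paper aims to close by reusing the auxiliary labels rather than relabeling the new domain from scratch.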
