Learning Word Meta-Embeddings

@inproceedings{Yin2016LearningWM,
  title={Learning Word Meta-Embeddings},
  author={Wenpeng Yin and Hinrich Sch{\"u}tze},
  booktitle={ACL},
  year={2016}
}
Word embeddings – distributed representations of words – are beneficial for many NLP tasks in deep learning. However, different embedding sets vary greatly in quality and in the characteristics of the information they capture. Instead of relying on a more advanced algorithm for embedding learning, this paper proposes an ensemble approach that combines different publicly available embedding sets with the aim of learning meta-embeddings. Experiments on word similarity and analogy tasks and on part-of-speech tagging…
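
The abstract only sketches the ensemble idea. As an illustration, below is a minimal Python sketch of the simplest style of meta-embedding baseline the paper discusses: concatenating L2-normalized vectors from several source embedding sets, optionally followed by truncated SVD to reduce dimensionality. The function names, the per-set weighting knob, and the toy vectors are illustrative assumptions, not the authors' released code, and the paper's proposed methods go beyond plain concatenation.

```python
import numpy as np

def concat_meta_embeddings(embedding_sets, vocab, weights=None):
    """Concatenate L2-normalized vectors from several source embedding sets.

    embedding_sets: list of dicts mapping word -> 1-D np.ndarray
    vocab: iterable of words present in every source set
    weights: optional per-set scalar weights (hypothetical knob, for illustration)
    """
    if weights is None:
        weights = [1.0] * len(embedding_sets)
    rows = []
    for word in vocab:
        parts = []
        for emb, w in zip(embedding_sets, weights):
            v = np.asarray(emb[word], dtype=float)
            # Normalize each source vector so no single set dominates the concatenation.
            parts.append(w * v / (np.linalg.norm(v) + 1e-8))
        rows.append(np.concatenate(parts))
    return np.vstack(rows)

def svd_meta_embeddings(concat_matrix, dim):
    """Reduce the concatenated matrix to `dim` dimensions with truncated SVD."""
    u, s, _ = np.linalg.svd(concat_matrix, full_matrices=False)
    return u[:, :dim] * s[:dim]

# Toy usage with two fake 3-dimensional "public" embedding sets.
glove_like = {"cat": np.array([1.0, 0.0, 2.0]), "dog": np.array([0.5, 1.0, 0.0])}
w2v_like = {"cat": np.array([0.2, 0.9, 0.1]), "dog": np.array([1.1, 0.3, 0.7])}
vocab = ["cat", "dog"]

conc = concat_meta_embeddings([glove_like, w2v_like], vocab)  # shape (2, 6)
meta = svd_meta_embeddings(conc, dim=2)                       # shape (2, 2)
print(conc.shape, meta.shape)
```

In practice the concatenation step already gives a usable meta-embedding; the SVD step trades a little information for a compact, fixed dimensionality across however many source sets are combined.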
Highly Cited
This paper has 20 citations.

Citations

Publications citing this paper (1 of 15 extracted citations shown):

Adapting Word Embeddings from Multiple Domains to Symptom Recognition from Psychiatric Notes

AMIA Joint Summits on Translational Science Proceedings • 2018

