Learning Word Meta-Embeddings by Using Ensembles of Embedding Sets

@article{Yin2015LearningWM,
  title={Learning Word Meta-Embeddings by Using Ensembles of Embedding Sets},
  author={Wenpeng Yin and Hinrich Sch{\"u}tze},
  journal={CoRR},
  year={2015},
  volume={abs/1508.04257}
}
Word embeddings – distributed representations for words used in deep learning – are beneficial for many tasks in Natural Language Processing (NLP). However, different embedding sets vary greatly in quality and in the characteristics of the semantics they capture. Instead of relying on a more advanced algorithm for embedding learning, this paper proposes an ensemble approach that combines different publicly available embedding sets with the aim of learning meta-embeddings. Experiments on word similarity and analogy tasks…
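The abstract describes combining several public embedding sets into meta-embeddings. Below is a minimal sketch of one simple ensemble strategy along these lines: per-word normalization, concatenation of the sets over their shared vocabulary, and an optional SVD projection to a smaller dimension. It assumes NumPy; the vocabulary, dimensions, and random vectors are hypothetical placeholders, not the paper's data or its exact methods.

```python
# Sketch: concatenation-style meta-embeddings with optional SVD reduction.
# Toy vocabulary and random vectors below are illustrative only.
import numpy as np

def normalize(vecs):
    """L2-normalize each row so different embedding sets contribute comparably."""
    norms = np.linalg.norm(vecs, axis=1, keepdims=True)
    return vecs / np.clip(norms, 1e-12, None)

def concat_meta_embeddings(emb_sets):
    """Concatenate several {word: vector} embedding sets over their shared vocabulary."""
    shared = set.intersection(*(set(e) for e in emb_sets))
    words = sorted(shared)
    blocks = [normalize(np.stack([e[w] for w in words])) for e in emb_sets]
    return words, np.hstack(blocks)

def svd_reduce(meta, dim):
    """Project the concatenated embeddings to `dim` dimensions via truncated SVD."""
    u, s, _ = np.linalg.svd(meta - meta.mean(axis=0), full_matrices=False)
    return u[:, :dim] * s[:dim]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vocab = ["king", "queen", "man", "woman"]
    set_a = {w: rng.normal(size=50) for w in vocab}   # e.g. a word2vec-style set
    set_b = {w: rng.normal(size=100) for w in vocab}  # e.g. a GloVe-style set
    words, meta = concat_meta_embeddings([set_a, set_b])
    reduced = svd_reduce(meta, dim=3)
    print(words, meta.shape, reduced.shape)  # 150-dim concatenation, 3-dim reduction
```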