Learning Word Meta-Embeddings by Using Ensembles of Embedding Sets

Wenpeng Yin and Hinrich Schütze
Word embeddings, distributed representations for words used in deep learning, are beneficial for many tasks in Natural Language Processing (NLP). However, different embedding sets vary greatly in quality and in the characteristics of the semantics they capture. Instead of relying on a more advanced algorithm for embedding learning, this paper proposes an ensemble approach that combines different publicly available embedding sets with the aim of learning meta-embeddings. Experiments on word similarity and analogy tasks…
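The simplest form of such an ensemble can be sketched as concatenating length-normalized vectors from each source embedding set; the function name and toy data below are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def concat_meta_embedding(embedding_sets, word):
    """Build a meta-embedding for `word` by L2-normalizing the vector
    from each source set and concatenating the results, so that no
    single embedding set dominates by scale."""
    parts = []
    for emb in embedding_sets:
        v = np.asarray(emb[word], dtype=float)
        parts.append(v / np.linalg.norm(v))  # unit-length per source
    return np.concatenate(parts)

# Two toy "public" embedding sets with different dimensionalities.
set_a = {"king": [1.0, 0.0, 0.0]}
set_b = {"king": [0.0, 2.0]}

meta = concat_meta_embedding([set_a, set_b], "king")
print(meta.shape)  # dimensionality is the sum of the source dimensions
```

A dimensionality-reduction step (e.g. truncated SVD over the concatenated vectors) can then shrink the meta-embedding back to a fixed size if the concatenation grows too large.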