In Defense of Word Embedding for Generic Text Representation

Guy Lev, Benjamin Eliot Klein and Lior Wolf. NLDB 2015.
Statistical methods have shown a remarkable ability to capture semantics. The word2vec method is frequently cited for capturing meaningful semantic relations between words from a large text corpus, and has the advantage of not requiring any tagging during training. The prevailing view, however, is that it lacks the ability to capture the semantics of word sequences and is virtually useless for most purposes unless combined with heavy machinery. This paper challenges that view, by showing…
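The simplest generic text representation the abstract alludes to is taking the mean of a sentence's word vectors and comparing sentences by cosine similarity. The sketch below illustrates this idea with tiny hand-made vectors; the vectors, vocabulary, and function names are all hypothetical stand-ins, and in practice the vectors would come from a trained word2vec model.

```python
import numpy as np

# Hypothetical 3-dimensional word vectors, for illustration only.
# Real word2vec embeddings would be learned from a large corpus.
vectors = {
    "cat": np.array([0.9, 0.1, 0.0]),
    "dog": np.array([0.8, 0.2, 0.0]),
    "sat": np.array([0.1, 0.9, 0.1]),
    "ran": np.array([0.0, 0.8, 0.2]),
    "car": np.array([0.0, 0.1, 0.9]),
}

def embed(sentence):
    """Represent a sentence as the mean of its known word vectors."""
    words = [w for w in sentence.lower().split() if w in vectors]
    return np.mean([vectors[w] for w in words], axis=0)

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

s1 = embed("the cat sat")
s2 = embed("the dog ran")
s3 = embed("the car")
# Semantically closer sentences score higher than dissimilar ones.
assert cosine(s1, s2) > cosine(s1, s3)
```

Despite its simplicity, this unweighted averaging scheme is the kind of lightweight baseline the paper defends against the view that word embeddings need "heavy machinery" to represent text.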
