Modeling Order in Neural Word Embeddings at Scale

@inproceedings{Trask2015ModelingOI,
  title={Modeling Order in Neural Word Embeddings at Scale},
  author={Andrew Trask and David Gilmore and Matthew Russell},
  booktitle={ICML},
  year={2015}
}
Natural Language Processing (NLP) systems commonly leverage bag-of-words co-occurrence techniques to capture semantic and syntactic word relationships. The resulting word-level distributed representations often ignore morphological information, though character-level embeddings have proven valuable to NLP tasks. We propose a new neural language model incorporating both word order and character order in its embedding. The model produces several vector spaces with meaningful substructure, as…
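To illustrate the limitation the abstract points at — this sketch is not from the paper — a bag-of-words co-occurrence model counts which words appear together, discarding their order. The toy code below (hypothetical helper `bow_cooccurrence`) shows that two sentences with opposite meanings produce identical co-occurrence statistics under such a model:

```python
from collections import Counter
from itertools import combinations

def bow_cooccurrence(tokens):
    """Count unordered co-occurring word pairs within a sentence.

    Using frozenset for each pair deliberately discards word order,
    mirroring the bag-of-words view the abstract describes.
    """
    return Counter(frozenset(pair) for pair in combinations(tokens, 2))

a = bow_cooccurrence("man bites dog".split())
b = bow_cooccurrence("dog bites man".split())
print(a == b)  # True: the two sentences are indistinguishable to this model
```

An order-aware model, as proposed in the paper, would assign these two sentences different representations, since the syntactic roles of "man" and "dog" differ.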
This paper has 23 citations.