Automatic Selection of Context Configurations for Improved Class-Specific Word Representations

@article{Vulic2016AutomaticSO,
  title={Automatic Selection of Context Configurations for Improved Class-Specific Word Representations},
  author={Ivan Vuli{\'c} and Roy Schwartz and Ari Rappoport and Roi Reichart and Anna Korhonen},
  journal={ArXiv},
  year={2016},
  volume={abs/1608.05528}
}
Recent work has demonstrated that state-of-the-art word embedding models require different context types to produce high-quality representations for different word classes such as adjectives (A), verbs (V), and nouns (N). This paper is concerned with identifying contexts useful for learning A/V/N-specific representations. We introduce a simple yet effective framework for selecting class-specific context configurations that yield improved representations for each class. We propose an automatic A…
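The abstract describes the core idea: treat the set of context types used to train embeddings as a searchable configuration, and select, per word class, the configuration that yields the best representations. As a minimal illustration only, the Python sketch below shows one plausible form such a search could take: a greedy forward selection over dependency-based context types, scored by a stand-in validation function. The candidate relation pool, the greedy strategy, and the toy scoring function are all assumptions for illustration, not the paper's published algorithm.

"""Hedged sketch: greedy selection of a class-specific context configuration.

The search space is sets of dependency-based context types (e.g. amod, dobj,
nsubj). Everything concrete here (candidate pool, search strategy, scores)
is an assumption for illustration.
"""

from typing import Callable, FrozenSet, Set

# Candidate individual context types (dependency relations); illustrative,
# not the pool used in the paper.
CANDIDATE_CONTEXTS = {"amod", "dobj", "nsubj", "prep", "advmod", "conj"}


def select_configuration(
    candidates: Set[str],
    score: Callable[[FrozenSet[str]], float],
) -> FrozenSet[str]:
    """Greedy forward selection: grow the configuration one context type
    at a time, keeping an addition only if it improves the score."""
    selected: FrozenSet[str] = frozenset()
    best = float("-inf")
    remaining = set(candidates)
    improved = True
    while improved and remaining:
        improved = False
        # Try each remaining context type as an extension of the current set.
        for ctx in sorted(remaining):
            s = score(frozenset(selected | {ctx}))
            if s > best:
                best, best_ctx = s, ctx
                improved = True
        if improved:
            selected = selected | {best_ctx}
            remaining.discard(best_ctx)
    return selected


if __name__ == "__main__":
    # Toy scoring stub standing in for "train embeddings with this
    # configuration, evaluate on a class-specific benchmark" (e.g. the
    # verb subset of a word-similarity dataset). Numbers are invented.
    toy_scores = {
        frozenset({"dobj"}): 0.30,
        frozenset({"nsubj"}): 0.28,
        frozenset({"dobj", "nsubj"}): 0.41,
        frozenset({"dobj", "nsubj", "amod"}): 0.39,
    }
    config = select_configuration(
        CANDIDATE_CONTEXTS, lambda c: toy_scores.get(c, 0.0)
    )
    print("Selected verb configuration:", sorted(config))

In a real pipeline, the scoring step would train embeddings restricted to the trial configuration and evaluate them on a class-specific benchmark; scoring each word class separately is what makes the selected configuration class-specific.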