Corpus ID: 220425529

Principal Word Vectors

@article{Basirat2020PrincipalWV,
  title={Principal Word Vectors},
  author={A. Basirat and Christian Hardmeier and Joakim Nivre},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.04629}
}
We generalize principal component analysis for embedding words into a vector space. The generalization is made at two major levels. The first is to generalize the concept of a corpus as a counting process defined by three key elements: a vocabulary set, a feature (annotation) set, and a context. This generalization enables the principal word embedding method to generate word vectors with regard to different types of contexts and different types of annotations provided for a corpus. The second…
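The counting-process view described above can be sketched in a few lines: count word-context co-occurrences, then apply PCA to the count matrix. This is a minimal illustration, not the paper's exact method; the toy corpus, the symmetric one-word window, and the use of surface words as both vocabulary and feature set are all assumptions made here for brevity.

```python
import numpy as np

# Toy corpus; in the paper's framing a corpus is a counting process
# over a vocabulary set, a feature (annotation) set, and a context.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]

# Assumption for this sketch: vocabulary and feature set are both the
# surface words, with a symmetric one-word window as the context.
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Word-feature co-occurrence counts.
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                C[idx[w], idx[sent[j]]] += 1

# Principal component analysis: centre the count matrix, then keep the
# top-k left singular vectors (scaled by singular values) as word vectors.
k = 2
X = C - C.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
vectors = U[:, :k] * S[:k]

for w in ("cat", "dog"):
    print(w, vectors[idx[w]].round(2))
```

Swapping the context definition (e.g. dependency relations instead of a linear window) or the feature set (e.g. part-of-speech annotations instead of words) only changes how `C` is filled; the PCA step stays the same, which is the flexibility the generalization is after.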
