• Corpus ID: 246442027

Towards a Theoretical Understanding of Word and Relation Representation

@article{Allen2022TowardsAT,
  title={Towards a Theoretical Understanding of Word and Relation Representation},
  author={Carl Allen},
  journal={ArXiv},
  year={2022},
  volume={abs/2202.00486}
}
Representing words by vectors, or embeddings, enables computational reasoning and is foundational to automating natural language tasks. For example, if word embeddings of similar words contain similar values, word similarity can be readily assessed, whereas judging that from their spelling is often impossible (e.g. cat/feline), and to predetermine and store similarities between all words is prohibitively time-consuming, memory-intensive and subjective. We focus on word embeddings learned from…
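To illustrate the point made in the abstract about assessing similarity from embeddings rather than spelling, here is a minimal sketch (not from the paper) using hypothetical low-dimensional vectors: cosine similarity between word vectors gives a cheap numerical similarity measure, even for words like "cat" and "feline" whose spellings share nothing.

```python
import numpy as np

# Hypothetical toy embeddings; real word vectors (e.g. word2vec, GloVe)
# are typically learned and 100-300 dimensional.
embeddings = {
    "cat":    np.array([0.72, 0.11, 0.65, -0.20]),
    "feline": np.array([0.70, 0.15, 0.60, -0.18]),
    "car":    np.array([-0.30, 0.80, 0.05, 0.44]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1 = same direction, 0 = orthogonal."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Similar meanings yield similar vectors, so similarity is read off numerically.
print(cosine_similarity(embeddings["cat"], embeddings["feline"]))  # close to 1
print(cosine_similarity(embeddings["cat"], embeddings["car"]))     # markedly lower
```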