Ontology-Aware Token Embeddings for Prepositional Phrase Attachment

@inproceedings{Dasigi2017OntologyAwareTE,
  title={Ontology-Aware Token Embeddings for Prepositional Phrase Attachment},
  author={Pradeep Dasigi and Waleed Ammar and Chris Dyer and Eduard Hovy},
  booktitle={ACL},
  year={2017}
}
Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over relevant semantic concepts. We use the new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept embeddings and model parameters.
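
The abstract describes the core idea at a high level: a token's representation is a context-dependent mixture over the embeddings of its candidate WordNet synsets. Below is a minimal sketch of that idea, not the authors' implementation. It assumes PyTorch and NLTK's WordNet interface; the class name, the bilinear attention scorer, and the dimensions are illustrative choices, and the paper additionally exploits the WordNet hypernym hierarchy and trains the concept embeddings jointly with the PP attachment model.

# Minimal sketch (illustrative, not the paper's model): a token vector is an
# attention-weighted mixture of its WordNet synset embeddings, conditioned on
# a context vector. Requires: torch, nltk (run nltk.download('wordnet') once).
import torch
import torch.nn as nn
from nltk.corpus import wordnet as wn


class OntologyAwareTokenEmbedder(nn.Module):
    """Embeds a word token by attending over its candidate WordNet synsets."""

    def __init__(self, synset_vocab, dim=100):
        super().__init__()
        # Index 0 is reserved for unknown / sense-less tokens.
        self.synset_index = {name: i + 1 for i, name in enumerate(synset_vocab)}
        self.synset_emb = nn.Embedding(len(synset_vocab) + 1, dim, padding_idx=0)
        self.attn = nn.Bilinear(dim, dim, 1)  # scores a (context, sense) pair

    def forward(self, word, context_vec):
        # Candidate senses for the surface form; fall back to a dummy sense.
        senses = [s.name() for s in wn.synsets(word)] or ["<none>"]
        ids = torch.tensor([self.synset_index.get(s, 0) for s in senses])
        sense_vecs = self.synset_emb(ids)                    # (num_senses, dim)
        ctx = context_vec.unsqueeze(0).expand_as(sense_vecs)
        scores = self.attn(ctx, sense_vecs).squeeze(-1)      # (num_senses,)
        weights = torch.softmax(scores, dim=0)               # distribution over senses
        return weights @ sense_vecs                          # context-sensitive token vector


# Usage sketch: the context vector would come from, e.g., an encoder over the sentence.
vocab = sorted({s.name() for w in ["bank", "deposit", "river"] for s in wn.synsets(w)})
embedder = OntologyAwareTokenEmbedder(vocab, dim=50)
token_vec = embedder("bank", torch.randn(50))
print(token_vec.shape)  # torch.Size([50])

The point of the sketch is that the same surface word receives different vectors in different sentences, because the attention weights over its senses depend on the context vector rather than on the word type alone.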