Ontology-Aware Token Embeddings for Prepositional Phrase Attachment

@inproceedings{Dasigi2017OntologyAwareTE,
  title={Ontology-Aware Token Embeddings for Prepositional Phrase Attachment},
  author={Pradeep Dasigi and Waleed Ammar and Chris Dyer and Eduard H. Hovy},
  booktitle={ACL},
  year={2017}
}
Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over relevant semantic concepts. We use the new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept…
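The core mechanism the abstract describes, a token embedding formed as a weighted sum of the word's candidate WordNet synset embeddings, with weights given by a distribution over senses conditioned on context, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy SYNSETS lexicon, the embedding dimension, and the dot-product scorer against a single context vector are all assumptions made for the example; in the paper the synset embeddings and the attention parameters are learned jointly with the PP-attachment model.

import numpy as np

# Toy synset lexicon (hypothetical): maps a word to candidate
# WordNet-style synset ids. A real system would query WordNet.
SYNSETS = {
    "bank": ["bank.n.01", "bank.n.02"],  # e.g. river bank vs. financial bank
    "river": ["river.n.01"],
}

rng = np.random.default_rng(0)
DIM = 8
# One vector per synset; in the paper these are learned, not random.
synset_emb = {s: rng.normal(size=DIM)
              for syns in SYNSETS.values() for s in syns}

def softmax(x):
    x = x - x.max()  # numerical stability
    e = np.exp(x)
    return e / e.sum()

def token_embedding(word, context_vec):
    """Context-sensitive token embedding: estimate a distribution over
    the word's candidate synsets, then take the weighted sum of their
    embeddings."""
    candidates = SYNSETS[word]
    # Score each sense against the context (dot product is an assumption).
    scores = np.array([synset_emb[s] @ context_vec for s in candidates])
    weights = softmax(scores)  # distribution over senses
    vecs = np.stack([synset_emb[s] for s in candidates])
    return weights @ vecs, dict(zip(candidates, weights))

# Usage: a context vector (here just a nearby word's synset embedding)
# shifts probability mass between the senses of "bank", so the same
# word type gets different token embeddings in different contexts.
context = synset_emb["river.n.01"]
vec, sense_dist = token_embedding("bank", context)
print(sense_dist)

The point of the sketch is the contrast with type-level embeddings: the returned vector depends on context_vec, so two tokens of "bank" in different sentences receive different representations, which is exactly the ambiguity that a single type-level vector collapses.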
