Prepositional Phrase Attachment over Word Embedding Products


We present a low-rank multi-linear model for the task of resolving prepositional phrase attachment ambiguity (PP task). Our model exploits tensor products of word embeddings, capturing all possible conjunctions of latent embedding dimensions. Our results on a wide range of datasets and task settings show that tensor products are the best compositional operation, and that a relatively simple multi-linear model using only word embeddings of lexical features can outperform more complex non-linear architectures that exploit the same information. Our proposed model achieves the best reported performance on an out-of-domain evaluation and performs competitively on out-of-domain dependency parsing datasets.
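To make the idea concrete, the following is a minimal sketch (not the authors' exact model) of how a low-rank bilinear score over the tensor (outer) product of two word embeddings can be used to pick an attachment site. The embedding dimension, rank, random embeddings, and the two-argument scoring form are all assumptions for illustration; the paper's model scores full lexical tuples.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 50   # embedding dimension (assumed for illustration)
r = 10   # rank of the factorized parameter matrix (assumed)

# Low-rank factors U, V stand in for a full d x d parameter matrix
# W = U @ V.T, so the bilinear score head^T W prep can be computed as
# (U^T head) . (V^T prep) without materializing W.
U = rng.standard_normal((d, r))
V = rng.standard_normal((d, r))

def score(head_vec, prep_vec):
    """Bilinear score over the outer product head (x) prep.

    head^T W prep = sum_ij W_ij * (head (x) prep)_ij, i.e. the model
    weights every pairwise conjunction of latent embedding dimensions.
    """
    return (U.T @ head_vec) @ (V.T @ prep_vec)

# Toy embeddings for the two candidate heads (verb vs. noun) and the
# preposition; random stand-ins for pre-trained word embeddings.
verb_vec = rng.standard_normal(d)
noun_vec = rng.standard_normal(d)
prep_vec = rng.standard_normal(d)

# Attach the PP to whichever candidate head scores higher.
attachment = "verb" if score(verb_vec, prep_vec) > score(noun_vec, prep_vec) else "noun"
print(attachment)
```

The low-rank factorization keeps the parameter count at 2dr instead of d^2 while still letting the score depend on all pairwise products of embedding coordinates, which is the sense in which the tensor product captures every conjunction of latent features.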


Cite this paper

@inproceedings{Madhyastha2017PrepositionalPA,
  title     = {Prepositional Phrase Attachment over Word Embedding Products},
  author    = {Pranava Swaroop Madhyastha and Xavier Carreras and Ariadna Quattoni},
  booktitle = {IWPT},
  year      = {2017}
}