Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures
@inproceedings{Hashimoto2014JointlyLW,
  title={Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures},
  author={Kazuma Hashimoto and Pontus Stenetorp and Makoto Miwa and Yoshimasa Tsuruoka},
  booktitle={EMNLP},
  year={2014}
}
We introduce a novel compositional language model that works on Predicate-Argument Structures (PASs). Our model jointly learns word representations and their composition functions using bag-of-words and dependency-based contexts. Unlike previous word-sequence-based models, our PAS-based model composes arguments into predicates by using the category information from the PAS. This enables our model to capture long-range dependencies between words and to better handle constructs such as verb-object and …
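The composition step the abstract describes can be sketched roughly as follows. This is an illustrative toy, not the authors' exact formulation: the dimensionality, the category names, and the concatenate-then-tanh composition form are all assumptions made for the sketch; the one idea taken from the abstract is that the PAS category selects which composition function merges an argument into its predicate.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # toy embedding dimensionality (illustrative)

# Word vectors for a predicate and an argument. Randomly initialized here;
# in the paper's setting they would be learned jointly with the composition weights.
pred_vec = rng.standard_normal(DIM)
arg_vec = rng.standard_normal(DIM)

# One weight matrix per PAS category (category names are hypothetical);
# the category label of the predicate-argument pair selects the matrix.
category_weights = {
    "verb_obj": rng.standard_normal((DIM, 2 * DIM)),
    "subj_verb": rng.standard_normal((DIM, 2 * DIM)),
}

def compose(pred, arg, category):
    """Compose an argument into its predicate with the
    category-specific weight matrix and a tanh nonlinearity."""
    W = category_weights[category]
    return np.tanh(W @ np.concatenate([pred, arg]))

# A verb-object pair and a subject-verb pair get different composition
# functions even for identical input vectors.
phrase_vec = compose(pred_vec, arg_vec, "verb_obj")
print(phrase_vec.shape)  # (4,)
```

The design point is that syntactic category, rather than surface word order, routes the pair to a composition function, which is what lets a PAS-based model treat long-range dependencies the same way as adjacent words.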
36 Citations
- Adaptive Joint Learning of Compositional and Non-Compositional Phrase Embeddings (ACL 2016)
- Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning (EMNLP 2015)
- Improving Sparse Word Representations with Distributional Inference for Semantic Composition (EMNLP 2016)
- Comparison Study on Critical Components in Composition Model for Phrase Representation (ACM Trans. Asian Low Resour. Lang. Inf. Process. 2017)
- Dependency Link Embeddings: Continuous Representations of Syntactic Substructures (VS@HLT-NAACL 2015)
- Learning Semantically and Additively Compositional Distributional Representations (ACL 2016)
- Task-Oriented Learning of Word Embeddings for Semantic Relation Classification (CoNLL 2015)