Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures

@inproceedings{Hashimoto2014JointlyLW,
  title={Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures},
  author={Kazuma Hashimoto and Pontus Stenetorp and Makoto Miwa and Yoshimasa Tsuruoka},
  booktitle={EMNLP},
  year={2014}
}
  • Kazuma Hashimoto, Pontus Stenetorp, Makoto Miwa, Yoshimasa Tsuruoka
  • Published in EMNLP 2014
  • Computer Science
  • We introduce a novel compositional language model that works on Predicate-Argument Structures (PASs). Our model jointly learns word representations and their composition functions using bag-of-words and dependency-based contexts. Unlike previous word-sequence-based models, our PAS-based model composes arguments into predicates by using the category information from the PAS. This enables our model to capture long-range dependencies between words and to better handle constructs such as verb-object and…
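
The abstract describes composing an argument vector into its predicate using the argument-category information from the PAS, with word representations learned from bag-of-words and dependency-based contexts. The snippet below is a minimal sketch of that idea only, not the authors' formulation: the toy vocabulary, the ARG1/ARG2 category labels, the concatenation-plus-tanh composition, and the skip-gram-style context scoring are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["she", "eat", "apple", "fruit", "red"]   # toy vocabulary (assumption)
CATEGORIES = ["ARG1", "ARG2"]                     # PAS argument categories (assumption)
DIM = 8                                           # embedding dimensionality

# Word embeddings, context (output) embeddings, and one composition matrix
# per argument category, all randomly initialized for illustration.
word_vec = {w: rng.normal(scale=0.1, size=DIM) for w in VOCAB}
ctx_vec = {w: rng.normal(scale=0.1, size=DIM) for w in VOCAB}
comp_mat = {c: rng.normal(scale=0.1, size=(DIM, 2 * DIM)) for c in CATEGORIES}

def compose(predicate, argument, category):
    """Compose an argument vector into its predicate's vector using the
    category-specific matrix, followed by a tanh non-linearity."""
    x = np.concatenate([word_vec[predicate], word_vec[argument]])
    return np.tanh(comp_mat[category] @ x)

def context_score(phrase_vec, context_word):
    """Dot-product compatibility of a composed phrase with a nearby context
    word, in the spirit of a skip-gram-style training objective."""
    return float(phrase_vec @ ctx_vec[context_word])

# Example: "eat" takes "apple" as its ARG2 argument; score the composed
# phrase against a bag-of-words context word such as "fruit".
phrase = compose("eat", "apple", "ARG2")
print(context_score(phrase, "fruit"))

In an actual training loop the scores would feed a predictive objective over the contexts, so that the word vectors and the category-specific composition matrices are learned jointly, which is the point the abstract emphasizes.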
    36 Citations
    • Syntax-Aware Multi-Sense Word Embeddings for Deep Compositional Models of Meaning (69 citations)
    • Model-Free Context-Aware Word Composition
    • Improving Sparse Word Representations with Distributional Inference for Semantic Composition (14 citations)
    • Comparison Study on Critical Components in Composition Model for Phrase Representation (9 citations)
    • Dependency Link Embeddings: Continuous Representations of Syntactic Substructures (7 citations)
    • Learning Semantically and Additively Compositional Distributional Representations (15 citations)
    • Paraphrase-Supervised Models of Compositionality (1 citation)
