Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures

@inproceedings{Hashimoto2014JointlyLW,
  title={Jointly Learning Word Representations and Composition Functions Using Predicate-Argument Structures},
  author={Kazuma Hashimoto and Pontus Stenetorp and Makoto Miwa and Yoshimasa Tsuruoka},
  booktitle={EMNLP},
  year={2014}
}
We introduce a novel compositional language model that works on Predicate-Argument Structures (PASs). Our model jointly learns word representations and their composition functions using bag-of-words and dependency-based contexts. Unlike previous word-sequence-based models, our PAS-based model composes arguments into predicates by using the category information from the PAS. This enables our model to capture long-range dependencies between words and to better handle constructs such as verb-object and…
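The core idea in the abstract — composing an argument vector into a predicate vector, with the composition function selected by the PAS category label — can be sketched as follows. This is an illustrative toy, not the paper's exact formulation: the embeddings are random rather than jointly learned, the category names (`verb_arg1`, `verb_arg2`) are invented placeholders, and the tanh-over-concatenation form is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy embedding dimensionality

# Toy word embeddings (jointly learned in the paper; random here
# purely for illustration).
emb = {w: rng.normal(size=dim) for w in ["eat", "she", "apple"]}

# One composition matrix per (hypothetical) PAS category, mapping the
# concatenated [predicate; argument] pair back to embedding space.
categories = ["verb_arg1", "verb_arg2"]
W = {c: rng.normal(scale=0.1, size=(dim, 2 * dim)) for c in categories}

def compose(pred_vec, arg_vec, category):
    """Fold an argument into a predicate using the matrix selected by
    the PAS category label (tanh nonlinearity assumed)."""
    pair = np.concatenate([pred_vec, arg_vec])
    return np.tanh(W[category] @ pair)

# "she eats an apple": compose both arguments into the predicate "eat",
# one PAS relation at a time.
v = compose(emb["eat"], emb["she"], "verb_arg1")
v = compose(v, emb["apple"], "verb_arg2")
print(v.shape)  # (4,)
```

Because each PAS category has its own matrix, a verb's subject and object are composed differently — which is what lets this style of model distinguish, say, verb-object from verb-subject relations regardless of surface word order.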

Citations

Publications citing this paper (29 citations in total; 6 shown below):

- Improving Sparse Word Representations with Distributional Inference for Semantic Composition
- Erratum: “From Paraphrase Database to Compositional Paraphrase Model and Back”
- Improving Semantic Composition with Offset Inference
- One Representation per Word - Does it make Sense for Composition?
- Using Part of Speech Tagging for Improving Word2vec Model
- A Neural Approach to Source Dependence Based Context Model for Statistical Machine Translation
