
Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization

@article{Akanksha2016ShallowDP,
  title={Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization},
  author={Akanksha and Jacob Eisenstein},
  journal={ArXiv},
  year={2016},
  volume={abs/1606.04503}
}
This paper describes the Georgia Tech team's approach to the CoNLL-2016 supplementary evaluation on discourse relation sense classification. We use long short-term memories (LSTM) to induce distributed representations of each argument, and then combine these representations with surface features in a neural network. The architecture of the neural network is determined by Bayesian hyperparameter search.
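The abstract describes a simple pipeline: each argument is encoded by an LSTM, the two argument vectors are concatenated with hand-crafted surface features, and a feed-forward network predicts the relation sense, with quantities such as hidden sizes and dropout tuned by Bayesian optimization. Below is a minimal, hypothetical PyTorch sketch of that kind of architecture; the class name, dimensions, dropout rate, and 15-way sense output are placeholder assumptions rather than the authors' actual implementation, and the hyperparameter search itself is omitted.

```python
# Hypothetical sketch: two LSTM encoders produce distributed representations of
# Arg1 and Arg2, which are concatenated with a surface-feature vector and passed
# through a feed-forward classifier. The fixed dimensions below stand in for the
# hyperparameters the paper selects with Bayesian optimization.
import torch
import torch.nn as nn


class SenseClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, lstm_dim=128,
                 surface_dim=40, hidden_dim=256, num_senses=15, dropout=0.3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One LSTM per argument; the final hidden state serves as the argument vector.
        self.arg1_lstm = nn.LSTM(embed_dim, lstm_dim, batch_first=True)
        self.arg2_lstm = nn.LSTM(embed_dim, lstm_dim, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * lstm_dim + surface_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, num_senses),
        )

    def forward(self, arg1_ids, arg2_ids, surface_feats):
        # arg1_ids, arg2_ids: (batch, seq_len) token-id tensors
        # surface_feats: (batch, surface_dim) hand-crafted feature vector
        _, (h1, _) = self.arg1_lstm(self.embed(arg1_ids))
        _, (h2, _) = self.arg2_lstm(self.embed(arg2_ids))
        combined = torch.cat([h1[-1], h2[-1], surface_feats], dim=-1)
        return self.classifier(combined)  # logits over relation senses


# Toy forward pass with placeholder dimensions.
model = SenseClassifier(vocab_size=10000)
logits = model(torch.randint(1, 10000, (4, 20)),
               torch.randint(1, 10000, (4, 25)),
               torch.randn(4, 40))
print(logits.shape)  # torch.Size([4, 15])
```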
1 Citation

    A Systematic Study of Neural Discourse Models for Implicit Discourse Relation
