Corpus ID: 14383462

Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization

  • Akanksha, Jacob Eisenstein
  • Published 2016
  • Computer Science
  • ArXiv
  • This paper describes the Georgia Tech team's approach to the CoNLL-2016 supplementary evaluation on discourse relation sense classification. We use long short-term memories (LSTM) to induce distributed representations of each argument, and then combine these representations with surface features in a neural network. The architecture of the neural network is determined by Bayesian hyperparameter search. 
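The Bayesian hyperparameter search the abstract mentions can be illustrated with a minimal, self-contained sketch: a one-dimensional Gaussian-process surrogate with an expected-improvement acquisition function, used to pick a hidden-layer size. Everything here is illustrative — the `validation_accuracy` objective is a made-up stand-in for training the network and scoring it on development data, and the paper's actual search space, surrogate model, and tooling are not specified by this abstract.

```python
import math

def rbf(a, b, ls=20.0):
    # Squared-exponential kernel over a 1-D hyperparameter.
    return math.exp(-((a - b) ** 2) / (2 * ls * ls))

def solve(A, y):
    # Gaussian elimination with partial pivoting (small dense systems only).
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(X, y, x_star, noise=1e-6):
    # GP posterior mean and variance at x_star given observations (X, y).
    n = len(X)
    K = [[rbf(X[i], X[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, y)                       # K^{-1} y
    k_star = [rbf(xi, x_star) for xi in X]
    mean = sum(k_star[i] * alpha[i] for i in range(n))
    v = solve(K, k_star)                      # K^{-1} k_star
    var = rbf(x_star, x_star) - sum(k_star[i] * v[i] for i in range(n))
    return mean, max(var, 1e-12)

def expected_improvement(mean, var, best):
    # Closed-form EI for maximization under a Gaussian posterior.
    s = math.sqrt(var)
    z = (mean - best) / s
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    return (mean - best) * cdf + s * pdf

def validation_accuracy(hidden_size):
    # Hypothetical stand-in for training the model and measuring dev accuracy;
    # a smooth bump peaking at hidden_size = 64.
    return 0.55 - ((hidden_size - 64) / 100.0) ** 2

candidates = list(range(8, 129, 8))
X = [8, 128]                                  # two initial configurations
y = [validation_accuracy(x) for x in X]
for _ in range(6):                            # sequential optimization loop
    best = max(y)
    nxt = max((c for c in candidates if c not in X),
              key=lambda c: expected_improvement(*gp_posterior(X, y, c), best))
    X.append(nxt)
    y.append(validation_accuracy(nxt))

best_hidden = X[max(range(len(y)), key=lambda i: y[i])]
```

The point of the loop is that each configuration is expensive to evaluate (a full training run), so the acquisition function trades off exploring uncertain regions against exploiting configurations predicted to score well — which is why Bayesian optimization is a common choice for architecture/hyperparameter search of the kind the abstract describes.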