Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization
@article{Akanksha2016ShallowDP,
  title={Shallow Discourse Parsing Using Distributed Argument Representations and Bayesian Optimization},
  author={Akanksha and Jacob Eisenstein},
  journal={ArXiv},
  year={2016},
  volume={abs/1606.04503}
}
This paper describes the Georgia Tech team's approach to the CoNLL-2016 supplementary evaluation on discourse relation sense classification. We use long short-term memory (LSTM) networks to induce distributed representations of each argument, and then combine these representations with surface features in a neural network. The architecture of the neural network is determined by Bayesian hyperparameter search.
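The architecture described above can be sketched as follows. This is a minimal illustration assuming PyTorch; the class name, dimensions, and feature sizes are hypothetical placeholders, and the paper's actual layer sizes and activations were selected by Bayesian hyperparameter search rather than fixed as here.

```python
import torch
import torch.nn as nn

class ArgPairClassifier(nn.Module):
    """Hypothetical sketch: an LSTM encodes each discourse argument into a
    distributed representation; the two representations are concatenated
    with surface features and passed to a feed-forward sense classifier."""
    def __init__(self, vocab_size=1000, embed_dim=50, hidden_dim=64,
                 n_surface=10, n_senses=15):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim + n_surface, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, n_senses),
        )

    def encode(self, tokens):
        # Use the final LSTM hidden state as the argument representation.
        _, (h, _) = self.lstm(self.embed(tokens))
        return h[-1]

    def forward(self, arg1, arg2, surface):
        rep = torch.cat([self.encode(arg1), self.encode(arg2), surface], dim=-1)
        return self.classifier(rep)

# Toy batch: 4 relation instances, arguments of length 12 and 9,
# 10 surface features each; output is one logit per sense label.
model = ArgPairClassifier()
logits = model(torch.randint(0, 1000, (4, 12)),
               torch.randint(0, 1000, (4, 9)),
               torch.randn(4, 10))
print(logits.shape)  # torch.Size([4, 15])
```

In this sketch the number of hidden units and the choice of nonlinearity are exactly the kind of settings the paper leaves to the Bayesian hyperparameter search.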