Tree-Structured Decoding with Doubly-Recurrent Neural Networks
@inproceedings{AlvarezMelis2017TRD,
  title={Tree-Structured Decoding with Doubly-Recurrent Neural Networks},
  author={David Alvarez-Melis and Tommi S. Jaakkola},
  booktitle={ICLR},
  year={2017}
}
We propose a neural network architecture for generating tree-structured objects from encoded representations. The core of the method is a doubly recurrent neural network model, composed of separate width and depth recurrences that are combined inside each cell (node) to generate an output. The topology of the tree is modeled explicitly together with the content: in response to an encoded vector representation, co-evolving recurrences are used to realize the associated tree and the…
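The abstract's core idea can be sketched in a few lines: each decoder step fuses an ancestral (depth, from the parent) recurrence with a fraternal (width, from the previous sibling) recurrence, and the combined state also predicts the tree's topology. The following is a minimal illustrative sketch, not the paper's exact cell; all parameter names (`Wa`, `Wf`, `Ua`, `Uf`, `V`, `ua`, `uf`) and the simple `tanh`/`sigmoid` gating are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative)

# Hypothetical parameters; the paper's actual cell uses its own gating.
Wa = rng.normal(size=(d, d)) * 0.1  # depth (ancestral) recurrence weights
Ua = rng.normal(size=(d, d)) * 0.1
Wf = rng.normal(size=(d, d)) * 0.1  # width (fraternal) recurrence weights
Uf = rng.normal(size=(d, d)) * 0.1
V = rng.normal(size=(d, 2 * d)) * 0.1  # combines the two recurrences
ua = rng.normal(size=d)  # topology head: does this node have children?
uf = rng.normal(size=d)  # topology head: does a next sibling follow?

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def drnn_cell(h_parent, h_sibling, x):
    """One decoding step of a doubly-recurrent cell (sketch).

    Fuses the depth and width recurrences into one predictive state,
    then predicts the node's topology bits alongside its content state.
    """
    ha = np.tanh(Wa @ h_parent + Ua @ x)       # ancestral (depth) state
    hf = np.tanh(Wf @ h_sibling + Uf @ x)      # fraternal (width) state
    h = np.tanh(V @ np.concatenate([ha, hf]))  # combined node state
    p_child = sigmoid(ua @ h)    # P(node has children)
    p_sibling = sigmoid(uf @ h)  # P(node has a next sibling)
    return h, p_child, p_sibling

# Root node: no parent or sibling history, so both states start at zero.
h, p_child, p_sibling = drnn_cell(np.zeros(d), np.zeros(d), rng.normal(size=d))
```

Because topology is predicted explicitly (via `p_child` and `p_sibling`) rather than with artificial padding or end-of-sequence tokens, decoding can expand each node into children and siblings until both probabilities fall below a threshold.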