Corpus ID: 35933354

TREE-STRUCTURED DECODING WITH DOUBLY-RECURRENT NEURAL NETWORKS

@inproceedings{AlvarezMelis2017TRD,
  title={Tree-structured Decoding with Doubly-Recurrent Neural Networks},
  author={David Alvarez-Melis and T. Jaakkola},
  year={2017}
}
We propose a neural network architecture for generating tree-structured objects from encoded representations. The core of the method is a doubly recurrent neural network model comprised of separate width and depth recurrences that are combined inside each cell (node) to generate an output. The topology of the tree is modeled explicitly together with the content. That is, in response to an encoded vector representation, co-evolving recurrences are used to realize the associated tree and the…
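The abstract describes two recurrences, one running parent-to-child (depth) and one running sibling-to-sibling (width), merged at each node to predict both the node's content and the tree topology. Below is a minimal NumPy sketch of one such cell. The parameter names (`Ua`, `Wa`, `Uf`, `Wf`, `V`, `ua`, `uf`) and the additive combination are illustrative assumptions; the paper's exact parameterization and gating may differ.

```python
import numpy as np

def drnn_cell(h_parent, h_sibling, x, p):
    """One decoding step at a tree node (sketch, not the paper's exact cell).

    h_parent  -- ancestral (depth) hidden state from the parent node
    h_sibling -- fraternal (width) hidden state from the previous sibling
    x         -- input embedding for this node
    p         -- dict of weights (hypothetical names, for illustration)
    """
    # Depth recurrence: update the ancestral state from the parent.
    h_a = np.tanh(p["Ua"] @ h_parent + p["Wa"] @ x)
    # Width recurrence: update the fraternal state from the previous sibling.
    h_f = np.tanh(p["Uf"] @ h_sibling + p["Wf"] @ x)
    # Combine both recurrences into one predictive state (simple sum here).
    h = np.tanh(h_a + h_f)
    # Content: logits over the output vocabulary for this node's label.
    logits = p["V"] @ h
    # Topology: probabilities that this node has children / a next sibling.
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    p_child = sigmoid(p["ua"] @ h)
    p_sibling = sigmoid(p["uf"] @ h)
    return h_a, h_f, logits, p_child, p_sibling

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
d, v = 8, 20  # hidden and vocabulary sizes
p = {k: rng.standard_normal((d, d)) for k in ("Ua", "Wa", "Uf", "Wf")}
p["V"] = rng.standard_normal((v, d))
p["ua"] = rng.standard_normal(d)
p["uf"] = rng.standard_normal(d)

h_a, h_f, logits, p_child, p_sib = drnn_cell(
    np.zeros(d), np.zeros(d), rng.standard_normal(d), p
)
```

During generation, a node would be expanded depth-first: `h_a` seeds each child's depth recurrence, `h_f` seeds the next sibling's width recurrence, and the two topology probabilities decide whether those expansions happen at all, so tree shape is predicted jointly with content rather than via special stop tokens.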
2 Citations

Ain't Nobody Got Time For Coding: Structure-Aware Program Synthesis From Natural Language
Learning natural language interfaces with neural models
