Corpus ID: 15848979

On End-to-End Program Generation from User Intention by Deep Neural Networks

@article{Mou2015OnEP,
  title={On End-to-End Program Generation from User Intention by Deep Neural Networks},
  author={Lili Mou and Rui Men and Ge Li and Lu Zhang and Zhi Jin},
  journal={ArXiv},
  year={2015},
  volume={abs/1510.07211}
}
This paper envisions an end-to-end program generation scenario using recurrent neural networks (RNNs): users can express their intention in natural language; an RNN then automatically generates the corresponding code in a character-by-character fashion. We demonstrate its feasibility through a case study and empirical analysis. To make such a technique fully useful in practice, we also point out several cross-disciplinary challenges, including modeling user intention, providing datasets, improving…
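The character-by-character generation loop the abstract describes can be sketched as follows. This is an illustrative toy, not the paper's trained model: a plain RNN with random, untrained weights (so the output is gibberish), a made-up toy "intent encoder" that simply runs the RNN over the intent string, and greedy decoding. The names (`encode`, `generate`, `VOCAB`) are assumptions for illustration only.

```python
import math
import random

# Toy sketch of RNN-based character-by-character code generation.
# Weights are random and untrained; only the mechanism is shown.
VOCAB = list("abcdefghijklmnopqrstuvwxyz(){};=+ \n")
CHAR2IDX = {c: i for i, c in enumerate(VOCAB)}
V, H = len(VOCAB), 16

rng = random.Random(0)

def mat(rows, cols):
    """Random weight matrix, small Gaussian init."""
    return [[rng.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

Wxh, Whh, Why = mat(H, V), mat(H, H), mat(V, H)  # input, recurrent, output weights

def step(x, h):
    """One RNN step: h' = tanh(Wxh @ x + Whh @ h)."""
    return [math.tanh(sum(Wxh[i][j] * x[j] for j in range(V)) +
                      sum(Whh[i][j] * h[j] for j in range(H)))
            for i in range(H)]

def one_hot(idx):
    x = [0.0] * V
    x[idx] = 1.0
    return x

def generate(intent, n_chars=20):
    """Encode the natural-language intent, then greedily decode characters."""
    h = [0.0] * H
    for c in intent.lower():           # toy intent "encoder": run RNN over the text
        if c in CHAR2IDX:
            h = step(one_hot(CHAR2IDX[c]), h)
    out, x = [], [0.0] * V
    for _ in range(n_chars):           # character-by-character decoding
        h = step(x, h)
        logits = [sum(Why[i][j] * h[j] for j in range(H)) for i in range(V)]
        idx = max(range(V), key=lambda i: logits[i])  # greedy pick
        out.append(VOCAB[idx])
        x = one_hot(idx)               # feed the prediction back in
    return "".join(out)

print(generate("swap two integers"))
```

A real system in the paper's spirit would train such a network on intent/code pairs and would typically use LSTM units and a learned encoder rather than the toy averaging shown here.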
