Corpus ID: 15848979

On End-to-End Program Generation from User Intention by Deep Neural Networks

@article{Mou2015OnEP,
  title={On End-to-End Program Generation from User Intention by Deep Neural Networks},
  author={Lili Mou and Rui Men and G. Li and L. Zhang and Zhi Jin},
  journal={ArXiv},
  year={2015},
  volume={abs/1510.07211}
}
  This paper envisions an end-to-end program generation scenario using recurrent neural networks (RNNs): users express their intention in natural language; an RNN then automatically generates corresponding code in a character-by-character fashion. We demonstrate its feasibility through a case study and empirical analysis. To make such a technique fully useful in practice, we also point out several cross-disciplinary challenges, including modeling user intention, providing datasets, improving…
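The character-by-character generation the abstract describes can be illustrated with a minimal decoding loop. This is a hypothetical sketch, not the paper's model: a toy character-bigram table stands in for the trained RNN's next-character distribution, and decoding is greedy.

```python
# Sketch of character-by-character code generation (assumption: a toy
# bigram table replaces the RNN; only the decoding loop is illustrated).
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count character bigrams as a stand-in for a learned next-char model."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, max_len=30):
    """Emit one character at a time, as an RNN decoder would."""
    out = [start]
    for _ in range(max_len - 1):
        dist = counts.get(out[-1])
        if not dist:
            break  # no continuation observed for this character
        out.append(dist.most_common(1)[0][0])  # greedy: most likely next char
    return "".join(out)

model = train_bigram("int main() { return 0; }")
print(generate(model, "i"))
```

A real system would condition the distribution on the user's natural-language intention and the full generated prefix, and would sample or beam-search rather than decode greedily.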
26 Citations

  • Latent Predictor Networks for Code Generation
  • CodeMend: Assisting Interactive Programming with Bimodal Embedding
  • CoCoNuT: combining context-aware neural translation models using ensemble for program repair
  • Deep Code Comment Generation
  • Dataset for a Neural Natural Language Interface for Databases (NNLIDB)
  • Generative API usage code recommendation with parameter concretization
  • Language to Code with Open Source Software (L. Tang, X. Mao, Z. Zhang; 2019 IEEE 10th International Conference on Software Engineering and Service Science, ICSESS)
  • A Syntax-Guided Neural Model for Natural Language Interfaces to Databases
  • Hierarchical Embedding for Code Search in Software Q&A Sites (R. Li, Gang Hu, Min Peng; 2020 International Joint Conference on Neural Networks, IJCNN)
