Generating Text with Recurrent Neural Networks

@inproceedings{Sutskever2011GeneratingTW,
  title={Generating Text with Recurrent Neural Networks},
  author={Ilya Sutskever and James Martens and Geoffrey E. Hinton},
  booktitle={ICML},
  year={2011}
}
Recurrent Neural Networks (RNNs) are very powerful sequence models that do not enjoy widespread use because it is extremely difficult to train them properly. Fortunately, recent advances in Hessian-free optimization have been able to overcome the difficulties associated with training RNNs, making it possible to apply them successfully to challenging sequence problems. In this paper we demonstrate the power of RNNs trained with the new Hessian-Free optimizer (HF) by applying them to character…
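
The truncated abstract refers to character-level language modeling: the network reads and emits text one character at a time, and new text is generated by repeatedly sampling a character from the model's output distribution and feeding it back in as the next input. Below is a minimal sketch of that sampling loop using a plain (vanilla) RNN with randomly initialized weights; it is an illustration only, not the paper's architecture or its Hessian-free training procedure, and the names used (vocab, Wxh, sample) are hypothetical.

import numpy as np

# Illustrative sketch of character-level sampling from a vanilla RNN.
# NOTE: this is NOT the paper's model or training method (the paper trains
# its RNNs with Hessian-free optimization); weights here are random stand-ins.

rng = np.random.default_rng(0)

vocab = list("abcdefghijklmnopqrstuvwxyz ")  # toy character set (assumption)
V, H = len(vocab), 64                        # vocabulary size, hidden size

Wxh = rng.normal(0.0, 0.01, (H, V))  # input-to-hidden weights
Whh = rng.normal(0.0, 0.01, (H, H))  # hidden-to-hidden (recurrent) weights
Why = rng.normal(0.0, 0.01, (V, H))  # hidden-to-output weights
bh, by = np.zeros(H), np.zeros(V)    # hidden and output biases

def sample(seed_char, n_chars):
    """Generate n_chars characters, feeding each sampled character
    back in as the next input (the standard generation loop)."""
    h = np.zeros(H)
    x = np.zeros(V)
    x[vocab.index(seed_char)] = 1.0
    out = [seed_char]
    for _ in range(n_chars):
        h = np.tanh(Wxh @ x + Whh @ h + bh)   # recurrent state update
        logits = Why @ h + by
        p = np.exp(logits - logits.max())
        p /= p.sum()                          # softmax over characters
        idx = rng.choice(V, p=p)              # sample next character
        x = np.zeros(V)
        x[idx] = 1.0
        out.append(vocab[idx])
    return "".join(out)

print(sample("t", 50))  # gibberish with untrained weights, by design

With trained weights (fit, as in the paper, with Hessian-free optimization), the same loop produces plausible text one character at a time.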
Highly Influential
This paper has highly influenced 54 other papers.
Highly Cited
This paper has 807 citations.

Citations

Publications citing this paper; showing 1-10 of 516 extracted citations.

Neural Joke Generation
Highly Influenced

Broad Learning for Healthcare
ArXiv • 2018
Highly Influenced

Citations per Year: Semantic Scholar estimates that this publication has 808 citations based on the available data.

