Recurrent Neural Networks

@inproceedings{Mikolov2015RecurrentNN,
  title={Recurrent Neural Networks},
  author={Tomas Mikolov and Armand Joulin and Sumit Chopra and Michaël Mathieu and Marc’Aurelio Ranzato},
  year={2015}
}
A recurrent neural network is a powerful model that learns temporal patterns in sequential data. For a long time, it was believed that recurrent networks are difficult to train using simple optimizers, such as stochastic gradient descent, due to the so-called vanishing gradient problem. In this paper, we show that learning longer-term patterns in real data, such as natural language, is perfectly possible using gradient descent. This is achieved by using a slight structural modification of the …
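
Below is a minimal, illustrative sketch (plain NumPy, not the authors' code) of the kind of structural modification the abstract alludes to: part of the hidden state is given a recurrent weight fixed close to the identity, so those units change slowly and act as longer-term memory, while the remaining units behave like a standard recurrent layer. The class name SlowFastRNNCell, the mixing coefficient alpha, and all dimensions are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SlowFastRNNCell:
    def __init__(self, n_in, n_fast, n_slow, alpha=0.95, seed=0):
        rng = np.random.default_rng(seed)
        scale = 0.1
        # Fast (standard) recurrent hidden units.
        self.A = rng.normal(0, scale, (n_fast, n_in))    # input -> fast
        self.R = rng.normal(0, scale, (n_fast, n_fast))  # fast -> fast
        self.P = rng.normal(0, scale, (n_fast, n_slow))  # slow -> fast
        # Slow "context" units: their recurrent weight is fixed to alpha * I
        # (close to identity), so their state changes slowly over time.
        self.B = rng.normal(0, scale, (n_slow, n_in))    # input -> slow
        self.alpha = alpha

    def step(self, x, h_fast, h_slow):
        # Slow units: leaky integration, i.e. recurrent matrix constrained to alpha * I.
        h_slow = (1.0 - self.alpha) * (self.B @ x) + self.alpha * h_slow
        # Fast units: ordinary nonlinear recurrent update, also reading the slow state.
        h_fast = sigmoid(self.A @ x + self.R @ h_fast + self.P @ h_slow)
        return h_fast, h_slow

# Usage: run the cell over a random input sequence.
if __name__ == "__main__":
    cell = SlowFastRNNCell(n_in=10, n_fast=20, n_slow=5)
    h_fast, h_slow = np.zeros(20), np.zeros(5)
    for t in range(100):
        x_t = np.random.randn(10)
        h_fast, h_slow = cell.step(x_t, h_fast, h_slow)
    print("fast state:", h_fast[:3], "slow state:", h_slow[:3])

Because the slow units' recurrence is a fixed near-identity map, their gradients decay much more gently through time than those of the fast units, which is one way to mitigate the vanishing gradient problem without a gated architecture.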
