Corpus ID: 9994916

BackPropagation Through Time

@inproceedings{Guo2013BackPropagationTT,
  title={BackPropagation Through Time},
  author={Jiang Guo},
  year={2013}
}
This report provides a detailed description of, and the necessary derivations for, the BackPropagation Through Time (BPTT) algorithm. BPTT is often used to train recurrent neural networks (RNNs). In contrast to feed-forward neural networks, an RNN can encode information from further back in the past, which makes it well suited to sequential models. BPTT extends the ordinary BP algorithm to fit the recurrent neural architecture. 1 Basic Definitions. For a two-layer feed-forward neural network, we…
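Since the excerpt above describes BPTT only in prose, the following is a minimal sketch of the algorithm for a single-hidden-layer Elman-style RNN, assuming a tanh hidden state and a squared-error loss; the weight names Wxh, Whh and Why are illustrative and not taken from the report.

import numpy as np

def bptt(xs, ys, Wxh, Whh, Why, h0):
    """Forward pass over the whole sequence, then backpropagation
    through time; returns gradients for the three weight matrices."""
    T = len(xs)
    hs = {-1: h0}
    outs = {}
    # Forward pass: unroll the recurrence over all time steps.
    for t in range(T):
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1])
        outs[t] = Why @ hs[t]

    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dh_next = np.zeros_like(h0)
    # Backward pass: walk the unrolled network from the last step to the first.
    for t in reversed(range(T)):
        dy = outs[t] - ys[t]               # dL/d(output) for squared error
        dWhy += np.outer(dy, hs[t])
        dh = Why.T @ dy + dh_next          # gradient from the output and from step t+1
        draw = (1.0 - hs[t] ** 2) * dh     # backprop through the tanh nonlinearity
        dWxh += np.outer(draw, xs[t])
        dWhh += np.outer(draw, hs[t - 1])
        dh_next = Whh.T @ draw
    return dWxh, dWhh, dWhy

Truncated BPTT would simply cap the backward loop at a fixed number of steps instead of walking all the way back to the start of the sequence.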

Citations

Non-Autoregressive vs Autoregressive Neural Networks for System Identification
TLDR
Comparisons with other state-of-the-art black-box system identification methods show that the non-autoregressive GRU implementation is the best-performing neural-network-based system identification method and, in the benchmarks without extrapolation, the best-performing black-box method.
Artificial neural networks for artificial intelligence
Artificial neural networks now have a long history as major techniques in computational intelligence with a wide range of applications for learning from data. There are many methods developed and…
Long Short-term Memory RNN
TLDR
This paper introduces the LSTM cell's architecture, explains how its components work together to alter the cell's memory and predict the output, and provides the formulas and foundations needed to compute a forward iteration through an LSTM.
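For context, below is a minimal numpy sketch of the standard LSTM forward step that this summary refers to; the gate names (forget, input, output) follow the usual formulation, and the weight and bias names Wf, Wi, Wo, Wc, bf, bi, bo, bc are illustrative rather than taken from that paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, Wf, Wi, Wo, Wc, bf, bi, bo, bc):
    """One forward iteration through an LSTM cell."""
    z = np.concatenate([h_prev, x])    # gates see the previous state and the input
    f = sigmoid(Wf @ z + bf)           # forget gate: what to erase from memory
    i = sigmoid(Wi @ z + bi)           # input gate: how much new content to write
    o = sigmoid(Wo @ z + bo)           # output gate: how much memory to expose
    c_tilde = np.tanh(Wc @ z + bc)     # candidate memory content
    c = f * c_prev + i * c_tilde       # updated cell state
    h = o * np.tanh(c)                 # hidden state / cell output
    return h, c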
On the use of phone-gram units in recurrent neural networks for language identification
TLDR
The phonotactic system performs ~13% better than the unigram-based RNNLM system; it is calibrated and fused with other scores from an acoustic i-vector system and a traditional PPRLM system, showing that they provide complementary information to the LID system.
Evolving and Spiking Connectionist Systems for Brain-Inspired Artificial Intelligence
  • N. Kasabov
  • Computer Science
  • Artificial Intelligence in the Age of Neural Networks and Brain Computing
  • 2019
TLDR
This chapter starts with a brief review of AI methods, from Aristotle's logic to the classical artificial neural networks (ANN) and hybrid systems used for AI today, and concludes that knowing and combining the various AI and ANN methods, and drawing more inspiration from neuroscience to create new methods, is the way forward for future research.
Suitable Recurrent Neural Network for Air Quality Prediction With Backpropagation Through Time
TLDR
The BPTT algorithm is applied to compare Elman RNN, Jordan RNN and hybrid network architectures for predicting time-series data of air-pollutant concentrations, in order to determine whether future air quality conditions will be good or bad for health and the environment.
Rapid phase-resolved prediction of nonlinear dispersive waves using machine learning
In this paper, we show that a revised convolutional recurrent neural network (CRNN) can decrease, by orders of magnitude, the time needed for the phase-resolved prediction of waves in a…
MODIFICATION AND PARALLELIZATION OF GENETIC ALGORITHM FOR SYNTHESIS OF ARTIFICIAL NEURAL NETWORKS
Context. The problem of automated synthesis of artificial neural networks, for further use in diagnosis, forecasting and pattern recognition, is solved. The object of the study was the process of…
Term Extraction as Sequence Labeling Task using Recurrent Neural Networks
Terminology extraction is mostly performed in two steps. In the first step, the text is filtered in order to identify word groups which fit predefined syntactic patterns. In the second step, these…
The implementation of a Deep Recurrent Neural Network Language Model on a Xilinx FPGA
TLDR
It is found that the DRNN language model can be deployed smoothly on the embedded system, and that the Overlay accelerator with an AXI Stream interface achieves 20 GOPS of processing throughput, which constitutes a 70.5× and 2.75× speedup over the work in Ref.30 and Ref.31, respectively.

References

SHOWING 1-2 OF 2 REFERENCES
Recurrent neural network based language model
TLDR
Results indicate that it is possible to obtain around a 50% reduction in perplexity by using a mixture of several RNN LMs, compared to a state-of-the-art backoff language model.
Learning task-dependent distributed representations by backpropagation through structure
  • C. Goller, A. Küchler
  • Computer Science
  • Proceedings of International Conference on Neural Networks (ICNN'96)
  • 1996
TLDR
A connectionist architecture together with a novel supervised learning scheme which is capable of solving inductive inference tasks on complex symbolic structures of arbitrary size is presented.