Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference

@article{Forcada1995LearningTI,
  title={Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference},
  author={Mikel L. Forcada and Rafael C. Carrasco},
  journal={Neural Computation},
  year={1995},
  volume={7},
  pages={923-930}
}
Recent work has shown that second-order recurrent neural networks (2ORNNs) may be used to infer regular languages. This paper presents a modified version of the real-time recurrent learning (RTRL) algorithm used to train 2ORNNs, which learns the initial state in addition to the weights. The results of this modification, which adds extra flexibility at a negligible cost in time complexity, suggest that it may be used to improve the learning of regular languages when the size of the network is…
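
As a rough illustration of the idea summarized in the abstract, the numpy sketch below trains a small second-order RNN with RTRL-style forward sensitivities, treating the initial state S0 as a trainable parameter alongside the weight tensor W. The network size, learning rate, choice of an acceptance unit, end-of-string squared error, toy language, and the clipping of S0 are illustrative assumptions, not details taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

N, M = 3, 2          # number of state units, number of input symbols (one-hot); assumed sizes
ALPHA = 0.5          # learning rate (assumed)

W = rng.uniform(-1.0, 1.0, size=(N, N, M))   # second-order weights W[i, j, k]
S0 = rng.uniform(0.0, 1.0, size=N)           # trainable initial state

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def run_and_grads(W, S0, xs, target):
    """Process one string (list of one-hot inputs) and return the squared error
    at the end of the string plus RTRL-style gradients for W and for S0."""
    S = S0.copy()
    # Sensitivities: p[i, j, k, l] = dS_i/dW_{jkl},  q[i, j] = dS_i/dS0_j
    p = np.zeros((N, N, N, M))
    q = np.eye(N)                     # dS_i^0 / dS0_j = delta_ij
    for x in xs:
        net = np.einsum('ijk,j,k->i', W, S, x)   # net_i = sum_{j,k} W_{ijk} S_j x_k
        S_new = sigmoid(net)
        gprime = S_new * (1.0 - S_new)
        # Jacobian of net w.r.t. the previous state: J[i, m] = sum_k W[i, m, k] x_k
        J = np.einsum('imk,k->im', W, x)
        # Weight sensitivities: direct term delta_ij * S_k * x_l plus recurrent term
        direct = np.zeros((N, N, N, M))
        idx = np.arange(N)
        direct[idx, idx] = np.einsum('k,l->kl', S, x)
        p = gprime[:, None, None, None] * (direct + np.einsum('im,mjkl->ijkl', J, p))
        # Initial-state sensitivities: the extra recursion that lets S0 be learned
        q = gprime[:, None] * (J @ q)
        S = S_new
    # Error on a designated acceptance unit (unit 0, by assumption) at string end
    err = S[0] - target
    E = 0.5 * err ** 2
    dW = err * p[0]           # dE/dW_{jkl}
    dS0 = err * q[0]          # dE/dS0_j
    return E, dW, dS0

def one_hot(sym):
    v = np.zeros(M)
    v["ab".index(sym)] = 1.0
    return v

# Toy task (assumed): accept strings over {a, b} that end in 'a'
data = [("aba", 1.0), ("ab", 0.0), ("ba", 1.0), ("abb", 0.0)]
for epoch in range(200):
    total = 0.0
    for string, target in data:
        xs = [one_hot(c) for c in string]
        E, dW, dS0 = run_and_grads(W, S0, xs, target)
        W -= ALPHA * dW              # update the weights ...
        S0 -= ALPHA * dS0            # ... and the initial state as well
        S0 = np.clip(S0, 0.0, 1.0)   # keep S0 in the sigmoid range (an assumption)
        total += E

print("total squared error after training:", total)

The only change relative to a standard second-order RTRL loop is the extra sensitivity matrix q and the corresponding update of S0, which is why the added cost in time complexity is negligible.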