
# Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference

```bibtex
@article{Forcada1995LearningTI,
  title   = {Learning the Initial State of a Second-Order Recurrent Neural Network during Regular-Language Inference},
  author  = {Mikel L. Forcada and Rafael C. Carrasco},
  journal = {Neural Computation},
  year    = {1995},
  volume  = {7},
  pages   = {923-930}
}
```

- Published 1995 in Neural Computation
- DOI: 10.1162/neco.1995.7.5.923

Recent work has shown that second-order recurrent neural networks (2ORNNs) may be used to infer regular languages. This paper presents a modified version of the real-time recurrent learning (RTRL) algorithm for training 2ORNNs that learns the initial state in addition to the weights. This modification, which adds extra flexibility at a negligible cost in time complexity, suggests that it may be used to improve the learning of regular languages when the size of the network is…
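The core idea in the abstract, treating the initial state as a trainable parameter alongside the weights, can be illustrated with a minimal NumPy sketch. This is not the paper's RTRL derivation: it uses a finite-difference gradient instead of the exact recurrent gradient, and all names and sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 3, 2                                  # state units, input-alphabet size
W = rng.normal(scale=0.5, size=(N, N, M))    # second-order weights W[j, i, k]
s0 = rng.uniform(size=N)                     # initial state, treated as a parameter

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W, s0, symbols):
    """Second-order update: s_j(t+1) = g(sum_i W[j, i, k_t] * s_i(t))."""
    s = s0
    for k in symbols:                        # k is the index of the input symbol
        s = sigmoid(W[:, :, k] @ s)
    return s[0]                              # unit 0 read out as accept/reject score

def loss(W, s0, string, target):
    return (forward(W, s0, string) - target) ** 2

def grad_s0(W, s0, string, target, eps=1e-6):
    """Finite-difference gradient of the loss w.r.t. the initial state,
    showing that s0 has a well-defined gradient and can be updated like a weight."""
    g = np.zeros_like(s0)
    for i in range(len(s0)):
        e = np.zeros_like(s0)
        e[i] = eps
        g[i] = (loss(W, s0 + e, string, target)
                - loss(W, s0 - e, string, target)) / (2 * eps)
    return g

string, target = [0, 1, 1, 0], 1.0
s0 = s0 - 0.5 * grad_s0(W, s0, string, target)   # one gradient step on s0 only
```

Because the state update is differentiable, the gradient with respect to `s0` falls out of the same chain rule as the weight gradients, which is why the paper's modification adds only negligible time complexity.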

