Simple Recurrent Networks Learn Context-Free and Context-Sensitive Languages by Counting

@article{Rodriguez2001SimpleRN,
  title={Simple Recurrent Networks Learn Context-Free and Context-Sensitive Languages by Counting},
  author={Paul Rodriguez},
  journal={Neural Computation},
  year={2001},
  volume={13},
  pages={2093-2118}
}

It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement Turing machines by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context…
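
To make the counting idea concrete, the sketch below (an illustration, not code from the paper) shows the symbolic solution a counter-based recurrent unit approximates for the context-free language a^n b^n: one hidden quantity is pushed up for each `a` and pulled down for each `b`, and the string is accepted when the count returns to zero without going negative.

```python
# Minimal sketch (assumption: illustrative only, not the paper's model) of a
# single recurrent "counter" unit recognizing the context-free language a^n b^n.

def accepts_anbn(string: str) -> bool:
    """Return True iff `string` is a^n b^n for some n >= 0."""
    count = 0.0          # recurrent hidden state used as a counter
    seen_b = False       # once a 'b' appears, no further 'a' is allowed
    for symbol in string:
        if symbol == 'a':
            if seen_b:
                return False         # 'a' after 'b' breaks the a^n b^n form
            count += 1.0             # count up while reading a's
        elif symbol == 'b':
            seen_b = True
            count -= 1.0             # count down while reading b's
            if count < 0:
                return False         # more b's than a's so far
        else:
            return False             # symbol outside the alphabet {a, b}
    return count == 0.0              # accept only if the count returns to zero

# Quick check of the sketch
assert accepts_anbn("aaabbb")
assert not accepts_anbn("aabbb")
assert not accepts_anbn("abab")
```

A trained SRN does not hold an exact integer count; the paper's analysis concerns how its continuous phase-space dynamics approximate this up-and-down counting behavior.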