FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling

@article{Back1991FIRAI,
  title={FIR and IIR Synapses, a New Neural Network Architecture for Time Series Modeling},
  author={Andrew D. Back and Ah Chung Tsoi},
  journal={Neural Computation},
  year={1991},
  volume={3},
  pages={375--385}
}
A new neural network architecture involving either a local-feedforward global-feedforward and/or a local-recurrent global-feedforward structure is proposed. A learning rule minimizing a mean squared error criterion is derived. The performance of this algorithm (the local-recurrent global-feedforward architecture) is compared with that of a local-feedforward global-feedforward architecture. It is shown that the local-recurrent global-feedforward model performs better than the local-feedforward global-feedforward…
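As a rough illustration of the two synapse types the abstract refers to, the sketch below implements the underlying filters in NumPy: an FIR synapse is a tapped delay line over past inputs, while an IIR synapse adds local feedback from its own past outputs. Function names and coefficients here are illustrative, not taken from the paper.

```python
import numpy as np

def fir_synapse(x, w):
    """FIR synapse: a tapped delay line.
    y(t) = sum_i w[i] * x(t - i)."""
    y = np.zeros(len(x))
    for t in range(len(x)):
        for i, wi in enumerate(w):
            if t - i >= 0:
                y[t] += wi * x[t - i]
    return y

def iir_synapse(x, b, a):
    """IIR synapse: a delay line plus local feedback.
    y(t) = sum_i b[i] * x(t - i) + sum_j a[j] * y(t - j), j >= 1."""
    y = np.zeros(len(x))
    for t in range(len(x)):
        for i, bi in enumerate(b):
            if t - i >= 0:
                y[t] += bi * x[t - i]
        for j, aj in enumerate(a, start=1):  # feedback taps start at lag 1
            if t - j >= 0:
                y[t] += aj * y[t - j]
    return y

# Impulse responses: the FIR response is finite, the IIR response decays forever.
impulse = np.array([1.0, 0.0, 0.0, 0.0])
print(fir_synapse(impulse, [0.5, 0.25]))   # → [0.5  0.25 0.   0.  ]
print(iir_synapse(impulse, [1.0], [0.5]))  # → [1.    0.5   0.25  0.125]
```

In the paper's local-recurrent global-feedforward architecture, such filters replace the scalar weights of a multilayer network while the layer-to-layer connections remain strictly feedforward.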
This paper has highly influenced 31 other papers.

Citations

Publications citing this paper.
This paper has 125 extracted citations.

References

Publications referenced by this paper.
Showing 1-3 of 3 references

Nonlinear signal processing using neural networks: Prediction

  • A. Lapedes, R. Farber
  • 1987

Dynamic error propagation networks

  • J.
  • Ph.D. dissertation, Cambridge University…
  • 1989

Supervised learning and systems with excess degrees of freedom

  • M. I. Jordan
  • COINS Tech. Rep. 88-27, University of Massachusetts, Amherst.
  • 1988
