• Corpus ID: 61613864

Finite impulse response neural networks with applications in time series prediction

@inproceedings{Wan1994FiniteIR,
  title={Finite impulse response neural networks with applications in time series prediction},
  author={Eric A. Wan},
  year={1994}
}
  • E. Wan
  • Published 2 January 1994
  • Computer Science
Traditional feedforward neural networks are static structures that simply map input to output. Motivated by biological considerations, a dynamic network is proposed which uses Finite Impulse Response (FIR) linear filters to model the processes of axonal transport, synaptic modulation, and membrane charge dissipation. Effectively, all weights in the static feedforward network are replaced by adaptive FIR filters. A training algorithm based on gradient descent is derived for the FIR structure… 
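To make the idea concrete, here is a minimal sketch of the forward pass only (not Wan's implementation; names such as `FIRLayer` are illustrative, and the training step is omitted): each scalar weight of a static network becomes a T-tap FIR filter over that connection's past activations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class FIRLayer:
    """One layer whose every weight is a T-tap FIR filter.

    W has shape (taps, n_out, n_in): W[k] multiplies the input
    from k steps in the past, so each scalar weight of a static
    network becomes a small linear filter.
    """
    def __init__(self, n_in, n_out, taps, rng):
        self.W = rng.normal(scale=0.1, size=(taps, n_out, n_in))
        self.b = np.zeros(n_out)
        self.buf = np.zeros((taps, n_in))  # tapped delay line of past inputs

    def step(self, x):
        # shift the delay line and insert the newest input
        self.buf = np.roll(self.buf, 1, axis=0)
        self.buf[0] = x
        # filter output: sum over taps k of W[k] @ x_{t-k}, then squash
        s = np.einsum('koi,ki->o', self.W, self.buf) + self.b
        return sigmoid(s)

# One-step-ahead prediction of a toy series, fed sample by sample.
rng = np.random.default_rng(0)
net = [FIRLayer(1, 8, taps=5, rng=rng), FIRLayer(8, 1, taps=5, rng=rng)]
series = np.sin(0.1 * np.arange(200))
for t in range(len(series) - 1):
    h = np.array([series[t]])
    for layer in net:
        h = layer.step(h)
    # h is the (untrained) prediction of series[t + 1]
```

Training adapts every tap by gradient descent; Wan's temporal backpropagation unfolds the filters in time to compute those gradients efficiently.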

Adaptive Post-Linearization of Dynamic Nonlinear Systems With Artificial Neural Networks
TLDR
This paper presents a test case involving a sensor with an input-output relation similar to that of a scaled dc SQUID, and discusses the application of the methodology to a variety of nonlinear sensing and amplification systems.
Hardware Implementation of FIR Neural Network for Applications in Time Series Data Prediction
TLDR
A hardware implementation of an FIR neural network for time series data prediction is presented; this variation of the standard neural network, the finite impulse response (FIR) neural network, has proven highly successful in achieving a higher degree of prediction accuracy across various time series prediction applications.
A unifying view of some training algorithms for multilayer perceptrons with FIR filter synapses
TLDR
Results are compared for the Mackey-Glass chaotic time series (1977) against a number of other methods, including a standard multilayer perceptron and a local approximation method.
Performance analysis of locally recurrent neural networks
TLDR
Locally recurrent networks are used to simulate the behaviour of Chua's circuit, a paradigm for studying chaos, and it is shown that such networks are able to identify the underlying link among the state variables of the circuit.
Neural network learning for time-series predictions using constrained formulations
TLDR
A recurrent FIR neural network is proposed; a constrained formulation for neural network learning is developed; an efficient violation-guided backpropagation algorithm for solving the constrained formulation is studied, based on the theory of extended saddle points; and neural network learning is applied to predicting both noise-free and noisy time series.
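For orientation, a sketch of the general shape of such a constrained formulation (not the paper's exact notation; the tolerance τ and constraint functions g_i are illustrative): individual error terms become explicit constraints rather than being folded into one aggregate objective, and the constrained local minima are characterized through extended saddle points of an l1-penalty function.

```latex
% Per-pattern errors become explicit constraints:
\min_{w} \; f(w)
\quad \text{s.t.} \quad g_i(w) \le \tau \quad \text{for every pattern } i,
% handled through an l_1-penalty function whose extended saddle
% points characterize the constrained local minima:
L(w,\lambda) \;=\; f(w) \;+\; \sum_i \lambda_i \max\bigl(0,\, g_i(w) - \tau\bigr),
\qquad \lambda_i \ge 0 .
```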
Sufficient Conditions for Error Backflow Convergence in Dynamical Recurrent Neural Networks
  • A. Aussem
  • Computer Science
    Neural Computation
  • 2002
TLDR
Using elementary matrix manipulations, an upper bound on the norm of the weight matrix is derived which ensures that the gradient vector, when propagated backward in time through the error-propagation network, decays exponentially to zero.
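For context, one standard way such a bound arises (a sketch, not necessarily Aussem's exact derivation): if the recurrent state evolves as x_{t+1} = σ(W x_t + u_t), then backpropagating the error one step multiplies it by a diagonal matrix of activation slopes and by Wᵀ, so bounding ‖W‖ bounds the per-step gain.

```latex
% Backpropagated error through one time step:
\delta_t = D_t\, W^{\!\top} \delta_{t+1},
\qquad D_t = \operatorname{diag}\bigl(\sigma'(W x_t + u_t)\bigr).
% Taking norms, with \sigma'_{\max} = \sup_z \sigma'(z):
\|\delta_t\| \le \sigma'_{\max}\, \|W\|\, \|\delta_{t+1}\|
\;\Longrightarrow\;
\|\delta_{t-k}\| \le \bigl(\sigma'_{\max}\, \|W\|\bigr)^{k}\, \|\delta_t\|,
% so \|W\| < 1/\sigma'_{\max} forces exponential decay
% (for the logistic sigmoid, \sigma'_{\max} = 1/4).
```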
Responses of central auditory neurons modeled with finite impulse response (FIR) neural networks
Recurrent and non-recurrent dynamic network paradigms: a case study
  • Walter Maydl, B. Sick
  • Computer Science
    Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
  • 2000
TLDR
This paper demonstrates by means of an application example (tool wear monitoring in turning) that in such cases dynamic, non-recurrent paradigms like time-delay neural networks should be preferred, and shows that the results of wear estimation can be improved significantly.
Exact Hessian calculation in feedforward FIR neural networks
  • T. J. Cholewo, J. Zurada
  • Computer Science
    1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)
  • 1998
TLDR
The second-order temporal backpropagation algorithm, which enables exact calculation of the second-order error derivatives for an FIR neural network, is introduced.
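The full algorithm handles nonlinearities through a second-order analogue of temporal backpropagation; the paper's derivation isn't reproduced here, but the linear special case makes the "exact" claim concrete. A minimal numpy sketch (variable names are illustrative):

```python
import numpy as np

# For a single T-tap FIR synapse y_t = w @ [x_t, ..., x_{t-T+1}],
# the output is linear in the weights w, so the squared-error loss
# sum_t (y_t - d_t)^2 is quadratic in w: its Hessian is exact and
# constant, H = 2 * X^T X, with rows of X the tapped-delay vectors.
T = 5
x = np.sin(0.1 * np.arange(100))          # toy input series
X = np.stack([x[t - T + 1:t + 1][::-1]    # row t: [x_t, ..., x_{t-T+1}]
              for t in range(T - 1, len(x))])
H = 2.0 * X.T @ X                         # exact T x T Hessian, independent of w
```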
Sequential network construction for time series prediction
  • T. J. Cholewo, J. Zurada
  • Computer Science
    Proceedings of International Conference on Neural Networks (ICNN'97)
  • 1997
TLDR
An application of the sequential network construction (SNC) method to select the size of several popular neural network predictor architectures for various benchmark training sets shows that the best predictions for the Wolfer data are computed using an FIR neural network, while for the Mackey-Glass data an Elman network yields superior results.
...