Finite impulse response neural networks with applications in time series prediction
@inproceedings{Wan1994FiniteIR, title={Finite impulse response neural networks with applications in time series prediction}, author={Eric A. Wan}, year={1994} }
Traditional feedforward neural networks are static structures that simply map input to output. Motivated by biological considerations, a dynamic network is proposed which uses Finite Impulse Response (FIR) linear filters to model the processes of axonal transport, synaptic modulation, and membrane charge dissipation. In effect, every weight in the static feedforward network is replaced by an adaptive FIR filter.
A training algorithm based on gradient descent is derived for the FIR structure…
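To make the architecture concrete: since each scalar synaptic weight becomes a tap-delay FIR filter, a neuron's activation at time t depends on a short window of past inputs. The sketch below is a minimal NumPy illustration of the forward pass only; the layer sizes, tap order, tanh activation, and all names are illustrative assumptions, and it does not implement Wan's temporal backpropagation training rule.

```python
import numpy as np

class FIRLayer:
    """Minimal sketch of one layer whose synapses are FIR filters."""

    def __init__(self, n_in, n_out, taps, seed=0):
        rng = np.random.default_rng(seed)
        # Each scalar weight of a static layer becomes a length-`taps`
        # FIR filter: weight tensor of shape (n_out, n_in, taps).
        self.w = rng.normal(0.0, 0.1, size=(n_out, n_in, taps))
        # Tap-delay line holding the last `taps` input samples.
        self.buffer = np.zeros((n_in, taps))

    def forward(self, x):
        # Shift the delay line by one step and insert the newest sample.
        self.buffer = np.roll(self.buffer, 1, axis=1)
        self.buffer[:, 0] = x
        # y_j = tanh( sum_i sum_k w[j,i,k] * x_i(t-k) ):
        # each input line is convolved with its synaptic filter.
        return np.tanh(np.einsum('jik,ik->j', self.w, self.buffer))

# Usage: feed a time series sample by sample; the layer's memory of
# the last `taps` samples is what gives the network its dynamics.
layer = FIRLayer(n_in=1, n_out=4, taps=5)
series = np.sin(0.1 * np.arange(100))
outputs = [layer.forward(np.array([x])) for x in series]
```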
123 Citations
Adaptive Post-Linearization of Dynamic Nonlinear Systems With Artificial Neural Networks
- Computer Science
- 1999
This paper presents a test case involving a sensor with an input-output relation similar to that of a scaled dc SQUID, and discusses the application of the methodology to a variety of nonlinear sensing and amplification systems.
Hardware Implementation of FIR Neural Network for Applications in Time Series Data Prediction
- Computer Science
- 2015
A hardware implementation of an FIR neural network for time series data prediction is presented; this variation of the standard neural network, the finite impulse response (FIR) neural network, has proven highly successful in achieving a higher degree of prediction accuracy across various time series prediction applications.
A unifying view of some training algorithms for multilayer perceptrons with FIR filter synapses
- Computer Science
- Proceedings of IEEE Workshop on Neural Networks for Signal Processing
- 1994
Results are compared for the Mackey-Glass chaotic time series (1977) against a number of other methods including a standard multilayer perceptron, and a local approximation method.
Performance analysis of locally recurrent neural networks
- Computer Science
- 1998
Locally recurrent networks are used to simulate the behaviour of Chua's circuit, a paradigm for studying chaos, and it is shown that such networks are able to identify the underlying link among the circuit's state variables.
Neural network learning for time-series predictions using constrained formulations
- Computer Science
- 2005
A recurrent FIR neural network is proposed, a constrained formulation for neural network learning is developed, an efficient violation-guided backpropagation algorithm for solving the constrained formulation is studied based on the theory of extended saddle points, and neural network learning is applied to predicting both noise-free and noisy time series.
Sufficient Conditions for Error Backflow Convergence in Dynamical Recurrent Neural Networks
- Computer Science
- Neural Computation
- 2002
Using elementary matrix manipulations, an upper bound on the norm of the weight matrix is provided, ensuring that the gradient vector, when propagated backward in time through the error-propagation network, decays exponentially to zero.
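For intuition, a condition of this general shape (an illustration of the standard argument, not the paper's exact statement) applies to a recurrence $x_{t+1} = \sigma(W x_t)$: if the product of the weight-matrix norm and the activation's slope bound is below one, each backward step contracts the gradient.

```latex
% Illustrative sufficient condition (not quoted from the paper):
% for the recurrence x_{t+1} = \sigma(W x_t),
\[
  \lambda \;:=\; \|W\|\,\sup_{u}|\sigma'(u)| \;<\; 1
  \quad\Longrightarrow\quad
  \left\|\frac{\partial E}{\partial x_{t-k}}\right\|
  \;\le\; \lambda^{k}\,\left\|\frac{\partial E}{\partial x_{t}}\right\|,
\]
% so the backpropagated gradient decays exponentially in k.
```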
Responses of central auditory neurons modeled with finite impulse response (FIR) neural networks
- Computer Science
- Comput. Methods Programs Biomed.
- 2004
Recurrent and non-recurrent dynamic network paradigms: a case study
- Computer Science
- Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
- 2000
This paper demonstrates, by means of an application example (tool wear monitoring in turning), that in these cases dynamic, non-recurrent paradigms such as time-delay neural networks should be preferred, and shows that the results of wear estimation can be improved significantly.
Exact Hessian calculation in feedforward FIR neural networks
- Computer Science
- 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)
- 1998
A second-order temporal backpropagation algorithm is introduced, which enables the exact calculation of the second-order error derivatives for an FIR neural network.
Sequential network construction for time series prediction
- Computer Science
- Proceedings of International Conference on Neural Networks (ICNN'97)
- 1997
An application of the sequential network construction (SNC) method to select the size of several popular neural network predictor architectures for various benchmark training sets shows that the best predictions for the Wolfer data are computed using an FIR neural network, while for the Mackey-Glass data an Elman network yields superior results.