# Time series prediction by using a connectionist network with internal delay lines

@inproceedings{Wan1993TimeSP, title={Time series prediction by using a connectionist network with internal delay lines}, author={Eric A. Wan}, year={1993} }

A neural network architecture, which models synapses as Finite Impulse Response (FIR) linear filters, is discussed for use in time series prediction. Analysis and methodology are detailed in the context of the Santa Fe Institute Time Series Prediction Competition. Results of the competition show that the FIR network performed remarkably well on a chaotic laser intensity time series.
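The core idea of the architecture can be illustrated with a minimal sketch: each synapse holds a tapped delay line of past input samples and a vector of FIR coefficients, and the neuron squashes the summed filter outputs. The function name, shapes, and choice of `tanh` below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fir_synapse_neuron(x_history, w, b=0.0):
    """One neuron whose synapses are FIR linear filters (illustrative sketch).

    x_history : array of shape (n_inputs, taps) -- each row is the tapped
                delay line (most recent sample first) for one input signal.
    w         : array of shape (n_inputs, taps) -- FIR coefficients, one
                filter per synapse.
    """
    # Each synapse computes the dot product of its coefficient vector with
    # the delayed samples of its input -- a discrete FIR convolution output.
    activation = np.sum(w * x_history) + b
    # A squashing nonlinearity, as in a standard sigmoidal unit.
    return np.tanh(activation)
```

Stacking layers of such units yields a feed-forward network whose weights are filters rather than scalars, which is what lets the network capture temporal structure internally instead of relying only on an external input window.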


## 161 Citations

Modeling Nonlinear Dynamics with Neural Networks: Examples in Time Series Prediction

- Computer Science
- 1993

Time series analysis using RBF networks with FIR/IIR synapses

- Computer Science, Neurocomputing
- 1998

Radial basis function (RBF) networks with dynamic synapses are proposed; the novelty consists in replacing the standard scalar output weights with discrete-time FIR/IIR filters.

Efficient Hybrid Neural Network for Chaotic Time Series Prediction

- Computer Science, ICANN
- 2001

The proposed hybrid neural network combines a traditional feed-forward network, trained with backpropagation, with local models implemented over a time-delay embedding.

Model identification of time-delay nonlinear system with FIR neural network

- Computer Science, Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
- 2003

This paper introduces the FIR neural network model and its temporal backpropagation algorithm; identification results demonstrate the model's good characteristics.

Time series forecasting using multilayer neural network constructed by a Monte-Carlo based algorithm

- Computer Science, 2009 1st IEEE Symposium on Web Society
- 2009

A multilayer neural network constructed by a Monte Carlo based algorithm is used to forecast time series events; a high level of generalization ability is obtained without requiring a careful choice of external parameters.

A modified FIR network for time series prediction

- Computer Science, Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.
- 2002

A modified FIR (Finite Impulse Response) network model is presented that improves the capability of a time series prediction system and can avoid the over-training effect caused by unbalanced learning data.

Combining Singular-Spectrum Analysis and neural networks for time series forecasting

- Computer Science, Neural Processing Letters
- 2005

The results show that the combined technique performs better than the same network applied directly to raw data, and is therefore well suited to forecasting short, noisy time series with an underlying deterministic data generating process (DGP).

Wavelet Multi-Layer Perceptron Neural Network for Time-Series Prediction

- 2002

In this paper, we investigate the effectiveness of wavelet Multi-Layer Perceptron (MLP) neural networks for temporal sequence prediction. It is essentially a neural network with input signal…

Time-series data prediction based on reconstruction of missing samples and selective ensembling of FIR neural networks

- Computer Science, Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.
- 2002

This paper considers the problem of time-series forecasting by a selective ensemble neural network when the input data are incomplete and shows that the prediction made by the proposed method is more accurate than those predicted by neural networks without a fill-in process or by a single fill-in process.

Learning long-term dependencies by the selective addition of time-delayed connections to recurrent neural networks

- Mathematics, Computer Science, Neurocomputing
- 2002

This paper supports the view that it is easier for recurrent neural networks to find good solutions if time-delayed connections are included in the recurrent networks, and presents an algorithm that allows one to choose the right locations and delays for such connections.

## References

Showing 1-10 of 54 references

Temporal Backpropagation: An Efficient Algorithm for Finite Impulse Response Neural Networks

- Computer Science
- 1991

An efficient gradient descent algorithm is derived which will be shown to be a temporal generalization of the familiar backpropagation algorithm.

Predicting the Future: a Connectionist Approach

- Computer Science, Int. J. Neural Syst.
- 1990

Since the ultimate goal is accuracy in the prediction, it is found that sigmoid networks trained with the weight-elimination algorithm outperform traditional nonlinear statistical approaches.

Temporal backpropagation for FIR neural networks

- Computer Science, 1990 IJCNN International Joint Conference on Neural Networks
- 1990

A network structure which models each synapse by a finite-impulse response (FIR) linear filter is proposed, and an efficient gradient descent algorithm, shown to be a temporal generalization of the familiar backpropagation algorithm, is derived.
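In the single-linear-unit special case, the gradient step of such an algorithm reduces to an LMS-style update: the coefficient gradient is the prediction error correlated with the tapped delay line. The sketch below illustrates that special case only; the function name and learning rate are assumptions, and the full temporal backpropagation algorithm additionally propagates errors backward through the layers' filters.

```python
import numpy as np

def fir_lms_step(w, x_taps, target, lr=0.01):
    """One training step for a single linear FIR predictor (illustrative
    sketch of the one-unit case; not the full multilayer algorithm).

    w      : FIR coefficients, shape (taps,)
    x_taps : tapped delay line of past samples, shape (taps,)
    target : the next sample to be predicted
    """
    y = w @ x_taps                 # prediction from the delay line
    err = target - y               # prediction error
    w = w + lr * err * x_taps      # gradient step: error times the taps
    return w, y, err
```

Iterating this update over a signal adapts the filter to one-step-ahead prediction, which is the building block the multilayer temporal algorithm generalizes.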

Nonlinear signal processing using neural networks: Prediction and system modelling

- Computer Science
- 1987

It is demonstrated that the backpropagation learning algorithm for neural networks may be used to predict points in a highly chaotic time series with orders of magnitude increase in accuracy over conventional methods, including the Linear Predictive Method and the Gabor-Volterra-Wiener Polynomial Method.

Generalization of backpropagation with application to a recurrent gas market model

- Computer Science, Neural Networks
- 1988

This paper will derive a generalization of backpropagation to recurrent systems (which input their own output), such as hybrids of perceptron-style networks and Grossberg/Hopfield networks, and does not require the storage of intermediate iterations to deal with continuous recurrence.

A Learning Algorithm for Continually Running Fully Recurrent Neural Networks

- Computer Science, Neural Computation
- 1989

The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal…

Neural Networks and the Bias/Variance Dilemma

- Computer Science, Neural Computation
- 1992

It is suggested that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues.

Multilayer feedforward networks are universal approximators

- Mathematics, Computer Science, Neural Networks
- 1989

This paper rigorously establishes that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel…

Modular Construction of Time-Delay Neural Networks for Speech Recognition

- Computer Science, Neural Computation
- 1989

It is shown that small networks trained to perform limited tasks develop time invariant, hidden abstractions that can be exploited to train larger, more complex nets efficiently, and phoneme recognition networks of increasing complexity can be constructed that all achieve superior recognition performance.

Time series analysis - univariate and multivariate methods

- Mathematics, Computer Science
- 1989

This work presents a meta-modelling framework for estimating the modelled properties of the Shannon filter, automating the labor-intensive and time-consuming process of Fourier analysis.