• Corpus ID: 12652643

Time series prediction by using a connectionist network with internal delay lines

@inproceedings{Wan1993TimeSP,
  title={Time series prediction by using a connectionist network with internal delay lines},
  author={Eric A. Wan},
  year={1993}
}
  • E. Wan
  • Published 1993
  • Computer Science
A neural network architecture, which models synapses as Finite Impulse Response (FIR) linear filters, is discussed for use in time series prediction. Analysis and methodology are detailed in the context of the Santa Fe Institute Time Series Prediction Competition. Results of the competition show that the FIR network performed remarkably well on a chaotic laser intensity time series.
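A minimal NumPy sketch of one such layer may help fix ideas; the name fir_layer, the tap count T, and the tanh activation are our illustrative choices, not specifics taken from the paper:

import numpy as np

def fir_layer(x_history, W):
    # One layer of FIR synapses: every input->output connection is a
    # small FIR filter over the last T+1 samples of that input.
    #   x_history : (T+1, n_in) array; row 0 is x(k), row t is x(k-t)
    #   W         : (T+1, n_in, n_out) array of tap weights
    # Returns y(k), where y_j(k) = tanh( sum_i sum_t W[t,i,j] * x_i(k-t) ).
    return np.tanh(np.einsum('ti,tij->j', x_history, W))

# Toy usage: 3 inputs, 4 outputs, 5 taps per synapse (T = 4).
rng = np.random.default_rng(0)
T, n_in, n_out = 4, 3, 4
W = rng.normal(scale=0.1, size=(T + 1, n_in, n_out))
history = np.zeros((T + 1, n_in))        # the internal delay line
for k in range(10):
    x_k = rng.normal(size=n_in)          # new sample shifts into the line
    history = np.vstack([x_k[None, :], history[:-1]])
    y_k = fir_layer(history, W)          # layer output at time k

Stacking such layers gives the full FIR network; during iterated prediction the one-step output is fed back as the next input.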
Modeling Nonlinear Dynamics with Neural Networks: Examples in Time Series Prediction
Time series analysis using RBF networks with FIR/IIR synapses
TLDR
Radial basis function (RBF) networks with dynamic synapses are proposed; the novel aspect consists in replacing the standard scalar output weights with discrete-time FIR/IIR filters.
Efficient Hybrid Neural Network for Chaotic Time Series Prediction
TLDR
The proposed hybrid neural network combines a traditional feed-forward network, trained with backpropagation, with local models implemented via a time-delay embedding.
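As an aside, the time-delay embedding mentioned here maps a scalar series into vectors of lagged samples; a minimal sketch, with illustrative delay tau and dimension d (not parameters from the cited paper):

import numpy as np

def delay_embed(series, d, tau):
    # Row k is the reconstructed state [s(t), s(t-tau), ..., s(t-(d-1)tau)]
    # for t = k + (d-1)*tau, usable directly as a predictor input vector.
    s = np.asarray(series)
    n = len(s) - (d - 1) * tau
    return np.stack([s[(d - 1 - j) * tau : (d - 1 - j) * tau + n]
                     for j in range(d)], axis=1)

# One-step-ahead training pairs from a toy series:
s = np.sin(0.3 * np.arange(200))
X = delay_embed(s, d=3, tau=2)   # states s(t), s(t-2), s(t-4)
y = s[(3 - 1) * 2 + 1:]          # targets s(t+1)
X = X[:-1]                       # last state has no target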
Model identification of time-delay nonlinear system with FIR neural network
  • Li-Feng Wang, Zheng-Xi Li
  • Computer Science
    Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
  • 2003
TLDR
The FIR neural network model and its temporal backpropagation algorithm are introduced in this paper, and experimental results demonstrate its effectiveness for identifying time-delay nonlinear systems.
Time series forecasting using multilayer neural network constructed by a Monte-Carlo based algorithm
TLDR
A multilayer neural network constructed by a Monte Carlo based algorithm is used to forecast time series events; a high level of generalization ability is obtained without a sensitive choice of external parameters.
A modified FIR network for time series prediction
  • H. J. Kim, Won Don Lee, H. Yang
  • Computer Science
    Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.
  • 2002
TLDR
A modified FIR (Finite Impulse Response) network model is presented that improves the capability of a time series prediction system and avoids the over-training effect caused by unbalanced learning data.
Combining Singular-Spectrum Analysis and neural networks for time series forecasting
TLDR
The results show that the combined technique outperforms the same network applied directly to raw data, and is therefore well suited to forecasting short, noisy time series with an underlying deterministic data generating process (DGP).
Wavelet Multi-Layer Perceptron Neural Network for Time-Series Prediction
In this paper, we investigate the effectiveness of a wavelet Multi-Layer Perceptron (MLP) neural network for temporal sequence prediction. It is essentially a neural network with input signal …
Time-series data prediction based on reconstruction of missing samples and selective ensembling of FIR neural networks
TLDR
This paper considers the problem of time-series forecasting with a selective ensemble of neural networks when the input data are incomplete, and shows that the prediction made by the proposed method is more accurate than those made by neural networks without a fill-in process or with a single fill-in process.
Learning long-term dependencies by the selective addition of time-delayed connections to recurrent neural networks
TLDR
This paper supports the view that it is easier for recurrent neural networks to find good solutions if time-delayed connections are included, and presents an algorithm for choosing the right locations and delays for such connections.

References

SHOWING 1-10 OF 54 REFERENCES
Temporal Backpropagation: An Efficient Algorithm for Finite Impulse Response Neural Networks
TLDR
An efficient gradient descent algorithm is derived and shown to be a temporal generalization of the familiar backpropagation algorithm.
Predicting the Future: a Connectionist Approach
TLDR
Since the ultimate goal is predictive accuracy, it is found that sigmoid networks trained with the weight-elimination algorithm outperform traditional nonlinear statistical approaches.
Temporal backpropagation for FIR neural networks
  • E. Wan
  • Computer Science
    1990 IJCNN International Joint Conference on Neural Networks
  • 1990
TLDR
A network structure which models each synapse by a finite-impulse response (FIR) linear filter is proposed, and an efficient gradient descent algorithm, shown to be a temporal generalization of the familiar backpropagation algorithm, is derived.
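In outline, the FIR synapse and its temporal backpropagation update can be written as follows (our notation, a sketch of the formulation rather than the paper's exact equations):

$$s_j(k) = \sum_i \sum_{t=0}^{T} w_{ij}(t)\, x_i(k-t), \qquad x_j(k) = f\bigl(s_j(k)\bigr),$$

$$\Delta w_{ij}(t) = -\eta\, \delta_j(k)\, x_i(k-t), \qquad \delta_i(k) = f'\bigl(s_i(k)\bigr) \sum_j \sum_{t=0}^{T} \delta_j(k+t)\, w_{ij}(t).$$

Because a hidden unit's delta depends on future deltas $\delta_j(k+t)$, the error is filtered backward through the same taps, and the weight updates lag the forward pass by the cumulative filter order.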
Nonlinear signal processing using neural networks: Prediction and system modelling
TLDR
It is demonstrated that the backpropagation learning algorithm for neural networks may be used to predict points in a highly chaotic time series with an orders-of-magnitude increase in accuracy over conventional methods, including the linear predictive method and the Gabor-Volterra-Wiener polynomial method.
Generalization of backpropagation with application to a recurrent gas market model
  • P. Werbos
  • Computer Science
    Neural Networks
  • 1988
TLDR
This paper derives a generalization of backpropagation to recurrent systems (which feed their own output back as input), such as hybrids of perceptron-style networks and Grossberg/Hopfield networks; the method does not require storing intermediate iterations to deal with continuous recurrence.
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.
Neural Networks and the Bias/Variance Dilemma
TLDR
It is suggested that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues.
Multilayer feedforward networks are universal approximators
This paper rigorously establishes that standard multilayer feedforward networks with as few as one hidden layer, using arbitrary squashing functions, are capable of approximating any Borel measurable function from one finite-dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
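For context, one common form of the result (a paraphrase, not the paper's exact statement): for any continuous $f$ on a compact $K \subset \mathbb{R}^r$, any squashing function $\psi$, and any $\varepsilon > 0$, there exist $q$ and parameters $\beta_j, b_j \in \mathbb{R}$, $a_j \in \mathbb{R}^r$ such that

$$\sup_{x \in K} \Bigl| f(x) - \sum_{j=1}^{q} \beta_j\, \psi\bigl(a_j^{\top} x + b_j\bigr) \Bigr| < \varepsilon.$$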
Modular Construction of Time-Delay Neural Networks for Speech Recognition
  • A. Waibel
  • Computer Science
    Neural Computation
  • 1989
TLDR
It is shown that small networks trained on limited tasks develop time-invariant hidden abstractions that can be exploited to train larger, more complex nets efficiently, and that phoneme recognition networks of increasing complexity can be constructed which all achieve superior recognition performance.
Time series analysis - univariate and multivariate methods
A standard textbook reference on univariate and multivariate methods for time series analysis.