• Corpus ID: 12652643

Time series prediction by using a connectionist network with internal delay lines

@inproceedings{Wan1993TimeSP,
  title={Time series prediction by using a connectionist network with internal delay lines},
  author={Eric A. Wan},
  year={1993}
}
  • E. Wan
  • Published 1993
  • Computer Science
A neural network architecture, which models synapses as Finite Impulse Response (FIR) linear filters, is discussed for use in time series prediction. Analysis and methodology are detailed in the context of the Santa Fe Institute Time Series Prediction Competition. Results of the competition show that the FIR network performed remarkably well on a chaotic laser intensity time series.
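The core idea of the paper, replacing each scalar synaptic weight with a short FIR filter over past inputs, can be sketched as follows (a minimal illustration, not the paper's implementation; the function name and tap values are hypothetical):

```python
import numpy as np

# Minimal sketch of an FIR "synapse" (hypothetical helper, not the
# paper's code): instead of a scalar weight, each connection holds a
# vector of tap weights w, and its output at time t is a weighted sum
# of the current and T-1 previous inputs, i.e. a causal FIR filter.
def fir_synapse(x, w):
    """Filter input sequence x with tap weights w (length-T FIR filter)."""
    T = len(w)
    xp = np.concatenate([np.zeros(T - 1), x])  # zero-pad so output is causal
    # output[t] = w[0]*x[t] + w[1]*x[t-1] + ... + w[T-1]*x[t-T+1]
    return np.array([xp[t:t + T] @ w[::-1] for t in range(len(x))])

x = np.array([1.0, 0.0, 0.0, 0.0])  # unit impulse
w = np.array([0.5, 0.3, 0.2])       # three filter taps
y = fir_synapse(x, w)               # impulse response reproduces the taps
```

A full FIR network stacks layers of such filtered connections followed by nonlinearities; training unfolds the filters in time, which is the temporal backpropagation algorithm referenced in the citing papers below.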
Modeling Nonlinear Dynamics with Neural Networks: Examples in Time Series Prediction
Time series analysis using RBF networks with FIR/IIR synapses
  • I. Ciocoiu
  • Computer Science, Engineering
    Neurocomputing
  • 1998
Efficient Hybrid Neural Network for Chaotic Time Series Prediction
TLDR
The proposed hybrid neural network combines a traditional feed-forward network, trained using backpropagation, with local models implemented as a time-delay embedding.
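The time-delay embedding mentioned in this summary, turning a scalar series into vectors of lagged samples, can be sketched generically (the function name and parameters are illustrative, not from the cited paper):

```python
import numpy as np

# Generic sketch of a time-delay embedding: each row collects dim
# samples spaced tau steps apart, the usual input vector for a local
# model or feed-forward predictor of a chaotic series.
def delay_embed(x, dim, tau=1):
    """Return rows [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[t:t + dim * tau:tau] for t in range(n)])

series = np.arange(6.0)         # 0, 1, 2, 3, 4, 5
X = delay_embed(series, dim=3)  # rows: [0,1,2], [1,2,3], [2,3,4], [3,4,5]
```

Each embedded row then serves as one training input, with the next sample of the series as its prediction target.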
Model identification of time-delay nonlinear system with FIR neural network
  • Li-Feng Wang, Zheng-Xi Li
  • Engineering
    Proceedings of the 2003 International Conference on Machine Learning and Cybernetics (IEEE Cat. No.03EX693)
  • 2003
TLDR
The FIR neural network model and its temporal backpropagation algorithm are introduced in this paper, and the results demonstrate the model's good characteristics.
Time series forecasting using multilayer neural network constructed by a Monte-Carlo based algorithm
TLDR
A multilayer neural network constructed by a Monte Carlo based algorithm is used to forecast time series events; a high level of generalization ability is obtained without a sensitive choice of external parameters.
A modified FIR network for time series prediction
  • H. J. Kim, Won Don Lee, H. Yang
  • Computer Science
    Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02.
  • 2002
TLDR
A modified FIR (Finite Impulse Response) network model is presented that improves the capability of a time series prediction system and avoids the over-training effect caused by unbalanced learning data.
On the prediction of the stochastic behavior of time series by use of Neural Networks - performance analysis and results
  • M. Eberspächer
  • Computer Science
    Data Communications and their Performance
  • 1995
TLDR
A procedure is presented that automatically adapts to a given reference source, in the sense that the simulated traffic source shows the same stochastic behavior as the reference source.
Wavelet Multi-Layer Perceptron Neural Network for Time-Series Prediction
TLDR
It is shown that the wavelet MLP network provides prediction performance comparable to the conventional MLP; after the less important inputs are eliminated, the wavelet MLP shows more consistent performance across different weight initializations than the conventional MLP.
Time-series data prediction based on reconstruction of missing samples and selective ensembling of FIR neural networks
TLDR
This paper considers the problem of time-series forecasting by a selective ensemble neural network when the input data are incomplete, and shows that the prediction made by the proposed method is more accurate than those produced by neural networks without a fill-in process or with a single fill-in process.
...

References

SHOWING 1-10 OF 54 REFERENCES
Predicting the Future: a Connectionist Approach
TLDR
Since the ultimate goal is accuracy in the prediction, it is found that sigmoid networks trained with the weight-elimination algorithm outperform traditional nonlinear statistical approaches.
Nonlinear signal processing using neural networks: Prediction and system modelling
TLDR
It is demonstrated that the backpropagation learning algorithm for neural networks may be used to predict points in a highly chaotic time series with orders of magnitude increase in accuracy over conventional methods including the Linear Predictive Method and the Gabor-Volterra-Wiener Polynomial Method.
A Learning Algorithm for Continually Running Fully Recurrent Neural Networks
The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.
Neural Networks and the Bias/Variance Dilemma
TLDR
It is suggested that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues.
Modular Construction of Time-Delay Neural Networks for Speech Recognition
  • A. Waibel
  • Computer Science
    Neural Computation
  • 1989
TLDR
It is shown that small networks trained to perform limited tasks develop time invariant, hidden abstractions that can be exploited to train larger, more complex nets efficiently, and phoneme recognition networks of increasing complexity can be constructed that all achieve superior recognition performance.
Time series analysis - univariate and multivariate methods
TLDR
This work presents a meta-modelling framework for estimating the modeled properties of the Shannon filter, which automates the very labor-intensive and therefore time-heavy process of Fourier analysis.
Phoneme recognition using time-delay neural networks
The authors present a time-delay neural network (TDNN) approach to phoneme recognition which is characterized by two important properties: (1) using a three-layer arrangement of simple computing
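The sliding-window computation described here can be sketched as a shared-weight 1-D convolution over time (a generic illustration of the TDNN idea; the function name and array shapes are assumptions, not the authors' code):

```python
import numpy as np

# Generic sketch of a TDNN layer: each hidden unit sees a sliding
# window of D input frames and applies the SAME weights at every time
# step, i.e. a shared-weight 1-D convolution followed by a nonlinearity.
def tdnn_layer(x, W, b):
    """x: (T, F) input frames; W: (D, F, H) shared taps; b: (H,) bias.
    Returns (T - D + 1, H) hidden activations."""
    D, F, H = W.shape
    T = x.shape[0]
    out = np.empty((T - D + 1, H))
    for t in range(T - D + 1):
        window = x[t:t + D]  # D consecutive frames
        out[t] = np.tanh(np.tensordot(window, W, axes=([0, 1], [0, 1])) + b)
    return out

x = np.random.randn(10, 16)          # 10 frames of 16 features
W = np.random.randn(3, 16, 8) * 0.1  # 3-frame window, 8 hidden units
h = tdnn_layer(x, W, np.zeros(8))    # shape (8, 8)
```

Because the weights are shared across time, the layer responds to a pattern regardless of where it occurs in the input, which is the time-invariance property the entry above highlights.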
...