Weight Initialization for Simultaneous Recurrent Neural Network Trained with a Fixed-point Learning Algorithm

@article{Serpen2003WeightIF,
  title={Weight Initialization for Simultaneous Recurrent Neural Network Trained with a Fixed-point Learning Algorithm},
  author={G{\"u}rsel Serpen and Yifeng Xu},
  journal={Neural Processing Letters},
  year={2003},
  volume={17},
  pages={33--41}
}
This letter presents a study of the Simultaneous Recurrent Neural Network, an adaptive algorithm, as a nonlinear dynamic system for static optimization. Empirical findings recently reported in the literature suggest that the Simultaneous Recurrent Neural Network offers superior performance on large-scale instances of combinatorial optimization problems in terms of desirable convergence characteristics, improved solution quality, and computational complexity measures. A theoretical…
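To make the "nonlinear dynamic system" framing concrete: a Simultaneous Recurrent Network computes its output by relaxing the network state to a fixed point of its recurrent dynamics, and the paper's concern, weight initialization, directly affects whether and how fast that relaxation converges. Below is a minimal, hypothetical sketch (not the paper's implementation) of the relaxation step; the function name, dimensions, and the small-weight initialization scale are illustrative assumptions.

```python
import numpy as np

def relax_to_fixed_point(W, b, x0, tol=1e-6, max_iters=1000):
    """Iterate x <- sigmoid(W @ x + b) until the state stops changing.

    This is the forward pass of an SRN viewed as a dynamic system: the
    network output is the fixed point x* satisfying x* = sigmoid(W x* + b).
    """
    x = x0
    for _ in range(max_iters):
        x_next = 1.0 / (1.0 + np.exp(-(W @ x + b)))  # elementwise sigmoid
        if np.max(np.abs(x_next - x)) < tol:
            return x_next
        x = x_next
    return x

rng = np.random.default_rng(0)
n = 8
# Illustrative initialization: small recurrent weights keep the map a
# contraction (sigmoid slope <= 1/4), so the iteration provably converges.
W = 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x_star = relax_to_fixed_point(W, b, np.zeros(n))
```

With larger initial weights the map need not be a contraction, and the relaxation can oscillate or diverge, which is one intuition for why initialization matters for fixed-point training algorithms such as recurrent backpropagation.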
