On the Selection of Weight Decay Parameter for Faulty Networks

@article{Leung2010OnTS,
  title={On the Selection of Weight Decay Parameter for Faulty Networks},
  author={Andrew Chi-Sing Leung and Hongjiang Wang and John Sum},
  journal={IEEE Transactions on Neural Networks},
  year={2010},
  volume={21},
  pages={1232--1244}
}
The weight-decay technique is an effective approach to handling both overfitting and weight faults. For fault-free networks, without an appropriate value of the decay parameter, the trained network is either overfitted or underfitted. However, many existing results on the selection of the decay parameter focus on fault-free networks only. It is well known that the weight-decay method can also suppress the effect of weight faults. For the faulty case, using a test set to select the decay parameter is not…
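
As a minimal illustration of the setting above (a sketch under toy assumptions, not the paper's algorithm), the following Python snippet trains a small RBF network with a weight-decay (ridge) penalty and then measures test error when trained weights are randomly stuck at zero, a simple open-fault model. The data, the Gaussian-RBF design, the fault model, and helper names such as train_weight_decay and faulty_test_mse are all illustrative assumptions.

```python
# Sketch: weight decay vs. weight faults in a toy RBF regression.
# Everything here (data, widths, fault model) is illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
x_train = rng.uniform(-1, 1, size=80)
y_train = np.sin(np.pi * x_train) + 0.1 * rng.standard_normal(80)
x_test = np.linspace(-1, 1, 200)
y_test = np.sin(np.pi * x_test)

# Gaussian RBF design matrix with fixed centers
centers = np.linspace(-1, 1, 20)
width = 0.2

def design(x):
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

Phi_train, Phi_test = design(x_train), design(x_test)

def train_weight_decay(lam):
    """Closed-form ridge solution: w = (Phi'Phi + lam*I)^(-1) Phi'y."""
    A = Phi_train.T @ Phi_train + lam * np.eye(len(centers))
    return np.linalg.solve(A, Phi_train.T @ y_train)

def faulty_test_mse(w, fault_rate=0.2, n_trials=200):
    """Average test MSE when each weight is independently stuck at
    zero with probability fault_rate (a simple open-fault model)."""
    errs = []
    for _ in range(n_trials):
        mask = rng.random(len(w)) > fault_rate
        errs.append(np.mean((Phi_test @ (w * mask) - y_test) ** 2))
    return np.mean(errs)

for lam in [1e-6, 1e-3, 1e-1, 1.0]:
    w = train_weight_decay(lam)
    clean = np.mean((Phi_test @ w - y_test) ** 2)
    faulty = faulty_test_mse(w)
    print(f"lam={lam:8.0e}  fault-free MSE={clean:.4f}  faulty MSE={faulty:.4f}")
```

Running the sketch shows that the decay value minimizing the faulty-case error is typically larger than the fault-free optimum: small decay values allow large, mutually cancelling weights whose removal is catastrophic. This mismatch is what makes fault-free selection criteria unreliable in the faulty case.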

Citations

Publications citing this paper.
Showing 7 of 14 citations.

  • RBF Networks Under the Concurrent Fault Situation. IEEE Transactions on Neural Networks and Learning Systems, 2012.

  • Objective Function and Learning Algorithm for the General Node Fault Situation. IEEE Transactions on Neural Networks and Learning Systems, 2016.

  • Reducing the complexity of an adaptive radial basis function network with a histogram algorithm. Neural Computing and Applications, 2016.

  • Structure Optimization of Neural Networks with L1 Regularization on Gates. 2018 IEEE Symposium Series on Computational Intelligence (SSCI), 2018.

  • A Regularizer Approach for RBF Networks Under the Concurrent Weight Failure Situation. IEEE Transactions on Neural Networks and Learning Systems, 2017.

  • An analog network approach to train RBF networks based on sparse recovery. 2014 19th International Conference on Digital Signal Processing, 2014.

  • Online Training for Open Faulty RBF Networks. Neural Processing Letters, 2014.

References

Publications referenced by this paper.
Showing 7 of 45 references.

  • Sparse modeling using orthogonal forward regression with PRESS statistic and regularization. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2004.

  • Optimal Brain Damage. Advances in Neural Information Processing Systems, 1989.

  • A Fault-Tolerant Regularizer for RBF Networks. IEEE Transactions on Neural Networks, 2008.

  • A Simple Trick for Estimating the Weight Decay Parameter. Neural Networks: Tricks of the Trade, 2012.

  • Regularization parameter estimation for feedforward neural networks. IEEE Transactions on Systems, Man, and Cybernetics, Part B, 2003.

  • Complete and partial fault tolerance of feedforward neural nets. IEEE Transactions on Neural Networks, 1995.

  • Note on generalization, regularization and architecture selection in nonlinear learning systems. Neural Networks for Signal Processing: Proceedings of the 1991 IEEE Workshop, 1991.