# Effect of Batch Learning in Multilayer Neural Networks

@inproceedings{Fukumizu1998EffectOB, title={Effect of Batch Learning in Multilayer Neural Networks}, author={K. Fukumizu}, booktitle={ICONIP}, year={1998} }

This paper discusses batch gradient descent learning in multilayer networks with a large amount of statistical training data. We emphasize the difference between regular cases, where the prepared model has the same size as the true function, and overrealizable cases, where the model has surplus hidden units to realize the true function. First, an experimental study on multilayer perceptrons and linear neural networks (LNN) shows that batch learning induces strong overtraining on both models…
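The setting the abstract describes can be sketched in a few lines of NumPy: batch gradient descent on a two-layer linear neural network with surplus hidden units (the overrealizable case). This is a toy illustration under assumed dimensions and hyperparameters, not the paper's actual experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed, not from the paper): the true map is a
# rank-1 linear function, but the two-layer LNN x -> B @ A @ x has 4
# hidden units, i.e. an overrealizable model with surplus units.
d_in, d_hid, d_out = 5, 4, 3
W_true = rng.standard_normal((d_out, 1)) @ rng.standard_normal((1, d_in))

n_train, noise = 30, 0.1
X_tr = rng.standard_normal((n_train, d_in))
Y_tr = X_tr @ W_true.T + noise * rng.standard_normal((n_train, d_out))
X_te = rng.standard_normal((1000, d_in))
Y_te = X_te @ W_true.T + noise * rng.standard_normal((1000, d_out))

A = 0.3 * rng.standard_normal((d_hid, d_in))
B = 0.3 * rng.standard_normal((d_out, d_hid))

def mse(X, Y):
    return float(np.mean((X @ (B @ A).T - Y) ** 2))

history = []
lr = 0.02
for step in range(10000):
    # Batch gradient descent: every step uses the entire training set.
    R = X_tr @ (B @ A).T - Y_tr                # residuals
    G = 2.0 * R.T @ X_tr / (n_train * d_out)   # d(MSE)/d(B @ A)
    gA, gB = B.T @ G, G @ A.T                  # chain rule to the factors
    A -= lr * gA
    B -= lr * gB
    if step % 1000 == 0:
        history.append((mse(X_tr, Y_tr), mse(X_te, Y_te)))

print(f"train MSE {mse(X_tr, Y_tr):.4f}  test MSE {mse(X_te, Y_te):.4f}")
```

With few noisy samples and surplus hidden units, the training error keeps dropping toward the noise floor while the test error can bottom out and rise again, which is the overtraining effect the paper's experiments examine.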

#### 11 Citations

- High-dimensional dynamics of generalization error in neural networks (Computer Science, Medicine; 2020)
- Minnorm training: an algorithm for training over-parameterized deep neural networks (Mathematics, Computer Science; 2018)
- Supplementary Information: A mathematical theory of semantic development in deep neural networks (2019)
- Exact solutions to the nonlinear dynamics of learning in deep linear neural networks (Computer Science, Physics; 2014)
- On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization (Computer Science, Mathematics; 2018)

#### References


- A Regularity Condition of the Information Matrix of a Multilayer Perceptron Network (Mathematics, Medicine; 1996)
- Universal approximation bounds for superpositions of a sigmoidal function (Mathematics, Computer Science; 1993)
- Statistical theory of overtraining: is cross-validation asymptotically effective? Advances in Neural Information Processing Systems 8, pp. 176–182 (1996)