Approximation and Estimation Bounds for Artificial Neural Networks

@article{Barron1991ApproximationAE,
  title={Approximation and Estimation Bounds for Artificial Neural Networks},
  author={Andrew R. Barron},
  journal={Machine Learning},
  year={1994},
  volume={14},
  pages={115--133}
}
For a common class of artificial neural networks, the mean integrated squared error between the estimated network and a target function f is shown to be bounded by $$O\!\left(\frac{C_f^2}{n}\right) + O\!\left(\frac{nd}{N}\log N\right),$$ where n is the number of nodes, d is the input dimension of the function, N is the number of training observations, and C_f is the first absolute moment of the Fourier magnitude distribution of f. The two contributions to this total risk are the approximation error and the estimation error.
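A brief sketch of how the two terms trade off: treating the stated bound as exact and minimizing over the number of nodes n (a heuristic calculation under that assumption, not a statement quoted from the paper) balances approximation against estimation error.

$$\frac{d}{dn}\left(\frac{C_f^2}{n} + \frac{nd}{N}\log N\right) = -\frac{C_f^2}{n^2} + \frac{d}{N}\log N = 0 \quad\Longrightarrow\quad n^\ast = C_f\sqrt{\frac{N}{d\log N}}.$$

Substituting n* back into the bound gives an overall rate of

$$\frac{C_f^2}{n^\ast} + \frac{n^\ast d}{N}\log N = 2\,C_f\sqrt{\frac{d\log N}{N}} = O\!\left(C_f\left(\frac{d\log N}{N}\right)^{1/2}\right),$$

so the optimal network size grows with the sample size N and shrinks with the input dimension d, and the total risk decays at a rate governed by C_f.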
This paper has highly influenced 36 other papers.
