Corpus ID: 218763248

Can Shallow Neural Networks Beat the Curse of Dimensionality? A mean field training perspective

@article{Wojtowytsch2020CanSN,
  title={Can Shallow Neural Networks Beat the Curse of Dimensionality? A mean field training perspective},
  author={Stephan Wojtowytsch and Weinan E},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.10815}
}
  • Mathematics, Computer Science
  • We prove that the gradient descent training of a two-layer neural network on empirical or population risk may not decrease population risk at an order faster than $t^{-4/(d-2)}$ under mean field scaling. Thus gradient descent training for fitting reasonably smooth, but truly high-dimensional data may be subject to the curse of dimensionality. We present numerical evidence that gradient descent training with general Lipschitz target functions becomes slower and slower as the dimension increases…
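To make the setting concrete, here is a minimal sketch of the training procedure the abstract refers to: gradient descent on the empirical risk of a two-layer ReLU network under mean field scaling, where the network output carries a $1/m$ factor over $m$ hidden units. The dimensions, learning rate, and Lipschitz target function below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (the paper's interest is large input dimension d).
d, m, n = 10, 200, 100  # input dimension, hidden width, sample count

# Synthetic data with a simple Lipschitz target (an assumption for illustration).
X = rng.standard_normal((n, d))
y = np.tanh(X[:, 0])

# Two-layer network under mean field scaling:
#   f(x) = (1/m) * sum_i a_i * relu(w_i . x)
W = rng.standard_normal((m, d))
a = rng.standard_normal(m)

def forward(X):
    # The 1/m prefactor is the mean field normalization.
    return (np.maximum(X @ W.T, 0.0) @ a) / m

def empirical_risk(X, y):
    return 0.5 * np.mean((forward(X) - y) ** 2)

risk_before = empirical_risk(X, y)

lr = 0.5
for step in range(500):
    pre = X @ W.T                      # (n, m) pre-activations
    h = np.maximum(pre, 0.0)           # ReLU features
    err = (h @ a) / m - y              # residuals, shape (n,)
    # Gradients of the empirical risk w.r.t. a and W (note the 1/(n*m) scale).
    grad_a = h.T @ err / (n * m)
    grad_W = ((err[:, None] * (pre > 0) * a[None, :]).T @ X) / (n * m)
    a -= lr * grad_a
    W -= lr * grad_W

risk_after = empirical_risk(X, y)
```

Note that under this scaling the parameter gradients carry an extra $1/m$ factor, so training the function moves slowly at fixed learning rate; the paper's $t^{-4/(d-2)}$ bound concerns how slowly the *population* risk can decay along such trajectories as $d$ grows.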
