Corpus ID: 204915902

Over Parameterized Two-level Neural Networks Can Learn Near Optimal Feature Representations

@article{Fang2019OverPT,
  title={Over Parameterized Two-level Neural Networks Can Learn Near Optimal Feature Representations},
  author={Cong Fang and Hanze Dong and Tong Zhang},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.11508}
}
  • Cong Fang, Hanze Dong, Tong Zhang
  • Published in ArXiv 2019
  • Mathematics, Computer Science
  • Recently, over-parameterized neural networks have been extensively analyzed in the literature. However, previous studies cannot satisfactorily explain why fully trained neural networks are successful in practice. In this paper, we present a new theoretical framework for analyzing over-parameterized neural networks, which we call neural feature repopulation. Our analysis can satisfactorily explain the empirical success of two-level neural networks that are trained by standard learning…
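
  To make the setting concrete, below is a minimal NumPy sketch of the kind of model the abstract refers to: an over-parameterized two-level (one-hidden-layer) network in the mean-field (1/m) parameterization, trained by noisy gradient descent on a toy regression task. All specifics (the target function, width, step size, and noise scale) are illustrative assumptions for this example, not the authors' construction.

import numpy as np

# Two-level network f(x) = (1/m) * sum_j a_j * relu(w_j . x), with
# large hidden width m, trained by noisy gradient descent on a
# hypothetical nonlinear regression target.
rng = np.random.default_rng(0)
d, m, n = 5, 2000, 256                 # input dim, hidden width, samples
X = rng.normal(size=(n, d))
y = np.sin(X @ rng.normal(size=d))     # toy target, assumed for illustration

W = rng.normal(size=(m, d))            # first-level weights (the "features")
a = rng.normal(size=m)                 # second-level (output) weights
lr, noise = 0.1, 1e-3

for t in range(2001):
    H = np.maximum(X @ W.T, 0.0)       # hidden activations, shape (n, m)
    pred = H @ a / m                   # mean-field scaling by 1/m
    err = pred - y                     # residual of the squared loss

    grad_a = H.T @ err / (n * m)
    mask = (H > 0).astype(float)       # relu derivative
    grad_W = ((err[:, None] * mask) * a[None, :]).T @ X / (n * m)

    # Noisy gradient descent: scale the step by m so per-neuron updates
    # do not vanish as width grows, then add a small Gaussian perturbation.
    a -= lr * m * grad_a + noise * rng.normal(size=a.shape)
    W -= lr * m * grad_W + noise * rng.normal(size=W.shape)

    if t % 500 == 0:
        print(f"step {t:4d}  mse {np.mean(err ** 2):.4f}")

  Under this parameterization the first-level weights W move appreciably during training, i.e. the features themselves are learned; this is the feature-learning regime the abstract refers to, as opposed to kernel-style analyses in which the features stay close to their initialization.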
