# Statistical Guarantees for Regularized Neural Networks

```bibtex
@article{Taheri2020StatisticalGF,
  title   = {Statistical Guarantees for Regularized Neural Networks},
  author  = {Mahsa Taheri and Fang Xie and Johannes Lederer},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2006.00294}
}
```

Neural networks have become standard tools in the analysis of data, but they lack comprehensive mathematical theories. For example, there are very few statistical guarantees for learning neural networks from data, especially for classes of estimators that are used in practice or that are at least similar to those used in practice. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with $\ell_1$-regularization…
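The estimator class the abstract describes — a least-squares term plus a regularizer, exemplified with $\ell_1$-regularization — can be sketched in code. The following is a minimal illustration only, not the paper's procedure: the network shape, the synthetic data, and the plain subgradient-descent optimizer are all assumptions made for the example.

```python
import numpy as np

# Illustrative sketch (not the paper's method): fit a one-hidden-layer
# ReLU network by minimizing a least-squares term plus an l1 regularizer,
#   0.5 * mean((f_theta(X) - y)^2) + lam * ||theta||_1,
# via plain subgradient descent. All sizes and constants are arbitrary.
rng = np.random.default_rng(0)
n, d, width = 200, 5, 10
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)  # synthetic regression data

W1 = rng.normal(scale=0.5, size=(d, width))  # hidden-layer weights
w2 = rng.normal(scale=0.5, size=width)       # output weights
lam, lr = 0.01, 0.01                         # regularization level, step size

def objective(W1, w2):
    pred = np.maximum(X @ W1, 0.0) @ w2
    penalty = np.abs(W1).sum() + np.abs(w2).sum()
    return 0.5 * np.mean((pred - y) ** 2) + lam * penalty

obj_init = objective(W1, w2)

for _ in range(500):
    H = np.maximum(X @ W1, 0.0)               # hidden activations
    r = H @ w2 - y                            # residuals of the LS term
    g2 = H.T @ r / n + lam * np.sign(w2)      # subgradient w.r.t. w2
    gH = np.outer(r, w2) * (H > 0)            # backprop through the ReLU
    g1 = X.T @ gH / n + lam * np.sign(W1)     # subgradient w.r.t. W1
    W1 -= lr * g1
    w2 -= lr * g2

obj_final = objective(W1, w2)
```

After the loop, `obj_final` should sit well below `obj_init`, and the $\ell_1$ term pushes many weights toward zero — the sparsity-inducing effect that motivates studying this estimator class.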

