Partial differential equation regularization for supervised machine learning

@article{Oberman2020PartialDE,
  title={Partial differential equation regularization for supervised machine learning},
  author={Adam M. Oberman},
  journal={ArXiv},
  year={2020},
  volume={abs/1910.01612}
}
This article is an overview of supervised machine learning problems for regression and classification. Topics include: kernel methods, training by stochastic gradient descent, deep learning architectures, losses for classification, statistical learning theory, and dimension-independent generalization bounds. Examples of implicit regularization in deep learning are presented, including data augmentation, adversarial training, and additive noise. These methods are reframed as explicit gradient…
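One of the implicit-to-explicit reframings the abstract refers to can be seen already in the linear case: training on inputs corrupted by additive Gaussian noise is, in expectation, ordinary training plus a gradient-norm (Tikhonov) penalty. The NumPy sketch below is an illustration of that identity under assumed toy values for w, x, y, and sigma; it is not code from the paper.

```python
# Minimal NumPy sketch (not from the paper) of the classical observation that
# training with additive input noise acts as explicit gradient regularization.
# For a linear model f(x) = w.x the identity is exact:
#   E_eps[(w.(x + eps) - y)^2] = (w.x - y)^2 + sigma^2 * ||w||^2,
# since grad_x f = w. For nonlinear f the same form holds to second order
# in the noise scale, via a Taylor expansion of f around x.
import numpy as np

rng = np.random.default_rng(0)

w = np.array([0.8, -1.5, 0.3])   # toy weights (hypothetical)
x = np.array([1.0, 0.2, -0.7])   # one training input
y = 0.5                          # its target
sigma = 0.1                      # additive noise scale

# Monte Carlo estimate of the noise-augmented squared loss.
eps = sigma * rng.standard_normal((500_000, 3))
noisy_loss = np.mean((eps @ w + (w @ x - y)) ** 2)

# Explicit form: plain loss plus a Tikhonov / gradient-norm penalty.
explicit = (w @ x - y) ** 2 + sigma ** 2 * np.sum(w ** 2)

print(f"noise-averaged loss          : {noisy_loss:.6f}")
print(f"loss + sigma^2 * ||grad f||^2: {explicit:.6f}")
```

Up to Monte Carlo error, the two printed values agree, which is the sense in which the noise augmentation is "equivalent" to adding an explicit gradient-penalty term to the training objective.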
