# The Directional Bias Helps Stochastic Gradient Descent to Generalize in Kernel Regression Models

```bibtex
@article{Luo2022TheDB,
  title={The Directional Bias Helps Stochastic Gradient Descent to Generalize in Kernel Regression Models},
  author={Yiling Luo and Xiaoming Huo and Yajun Mei},
  journal={2022 IEEE International Symposium on Information Theory (ISIT)},
  year={2022},
  pages={678-683}
}
```
• Published 29 April 2022
• Computer Science
We study the Stochastic Gradient Descent (SGD) algorithm in nonparametric statistics, kernel regression in particular. The directional bias property of SGD, known in the linear regression setting, is generalized to kernel regression. More specifically, we prove that SGD with a moderate, annealing step size converges along the direction of the eigenvector corresponding to the largest eigenvalue of the Gram matrix. In addition, Gradient Descent (GD) with a moderate or small…
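The GD side of this dichotomy is easy to see in a small numpy sketch (the toy data, RBF kernel, and step size below are our own illustrative choices, not the paper's experiments). For the kernel-regression quadratic, the GD error evolves as e_{t+1} = (I − ηK)e_t, so the component along the top eigenvector of the Gram matrix K contracts fastest and the remaining error tilts toward small-eigenvalue directions — in contrast to the SGD bias toward the largest eigenvalue that the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n inputs and an RBF (Gaussian) Gram matrix K_ij = k(x_i, x_j).
n = 30
X = rng.uniform(-1.0, 1.0, size=(n, 1))
K = np.exp(-((X - X.T) ** 2) / (2 * 0.5 ** 2))

eigvals, eigvecs = np.linalg.eigh(K)        # eigenvalues in ascending order
v_min, v_max = eigvecs[:, 0], eigvecs[:, -1]
lam_max = eigvals[-1]

# GD error dynamics for the quadratic objective with Hessian K:
# e_{t+1} = (I - eta * K) e_t, starting from a random initial error.
e = rng.standard_normal(n)
eta = 0.9 / lam_max                          # small, stable step size
for _ in range(300):
    e = e - eta * (K @ e)

# Cosine of the remaining error with the extreme eigendirections of K.
cos_max = abs(v_max @ e) / np.linalg.norm(e)
cos_min = abs(v_min @ e) / np.linalg.norm(e)

# The top-eigenvalue component is contracted by |1 - eta*lam_max| = 0.1
# per step and is numerically gone; small-eigenvalue directions persist.
print(f"alignment with v_max: {cos_max:.2e}")
print(f"alignment with v_min: {cos_min:.2e}")
```

The factor |1 − ηλ_i| per iteration makes the decay rate monotone in λ_i, which is why small-step GD ends up biased toward the smallest-eigenvalue directions; the paper's contribution is showing that moderate, annealed SGD reverses this and aligns with the largest-eigenvalue eigenvector.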
