Reduced Rank Kernel Ridge Regression

@article{Cawley2002ReducedRK,
  title={Reduced Rank Kernel Ridge Regression},
  author={Gavin C. Cawley and Nicola L. C. Talbot},
  journal={Neural Processing Letters},
  year={2002},
  volume={16},
  pages={293--302}
}
Ridge regression is a classical statistical technique that attempts to address the bias-variance trade-off in the design of linear regression models. A reformulation of ridge regression in dual variables permits a non-linear form of ridge regression via the well-known ‘kernel trick’. Unfortunately, unlike support vector regression models, the resulting kernel expansion is typically fully dense. In this paper, we introduce a reduced rank kernel ridge regression (RRKRR) algorithm, capable of…
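The dual formulation mentioned in the abstract can be sketched in a few lines of NumPy. This is the standard dense kernel ridge regression that the paper takes as its starting point, not the reduced-rank algorithm itself (the abstract is truncated before the method is described): one solves (K + λI)α = y for the dual coefficients, and the resulting expansion f(x) = Σᵢ αᵢ k(xᵢ, x) involves every training point, which is exactly the density problem RRKRR targets. Function names and the toy data below are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def fit_krr(X, y, lam=1e-2, gamma=1.0):
    """Dual-variable ridge regression: solve (K + lam*I) alpha = y.

    Note that alpha generally has no zero entries, so the resulting
    kernel expansion is fully dense -- the motivation for RRKRR.
    """
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, gamma=1.0):
    """Evaluate f(x) = sum_i alpha_i * k(x_i, x) at the new points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: fit a smooth curve from 30 samples.
X = np.linspace(0.0, 3.0, 30)[:, None]
y = np.sin(X).ravel()
alpha = fit_krr(X, y, lam=1e-3, gamma=2.0)
pred = predict_krr(X, alpha, X, gamma=2.0)
```

Because every training point carries a coefficient, prediction cost grows linearly with the training-set size; a reduced-rank approach restricts the expansion to a subset of points to cut that cost.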


