Covering Numbers for Support Vector Machines

@inproceedings{Guo1999CoveringNF,
  title={Covering Numbers for Support Vector Machines},
  author={Ying Guo and Peter L. Bartlett and John Shawe-Taylor and Robert C. Williamson},
  booktitle={COLT},
  year={1999}
}
Support vector (SV) machines are linear classifiers that use the maximum margin hyperplane in a feature space defined by a kernel function. Until recently, the only bounds on the generalization performance of SV machines (within Valiant's probably approximately correct framework) took no account of the kernel used except through its effect on the margin and radius. More recently, it has been shown that one can bound the relevant covering numbers using tools from functional analysis. In this paper…
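To make the abstract's first sentence concrete, the sketch below shows what "a linear classifier in a feature space defined by a kernel function" looks like in code. It uses a kernel perceptron rather than the paper's maximum-margin optimizer (solving the SVM dual is out of scope here), so it illustrates only the kernelized decision function f(x) = Σᵢ αᵢ yᵢ K(xᵢ, x); the toy data and parameters are hypothetical.

```python
import math

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel: K(x, y) = exp(-gamma * ||x - y||^2).
    # K implicitly defines the feature space the abstract refers to.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

# Toy 2-D data: two separable clusters (hypothetical example data).
X = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 1.1)]
y = [-1, -1, 1, 1]

# Kernel perceptron: learn dual coefficients alpha_i. The decision
# function f(x) = sum_i alpha_i * y_i * K(x_i, x) is linear in the
# kernel-induced feature space, like an SVM's (but without the
# max-margin property an SVM would add).
alpha = [0.0] * len(X)
for _ in range(10):  # a few passes over the toy data
    for i, (xi, yi) in enumerate(zip(X, y)):
        f = sum(alpha[j] * y[j] * rbf_kernel(X[j], xi) for j in range(len(X)))
        if yi * f <= 0:      # misclassified -> strengthen this point's weight
            alpha[i] += 1.0

def predict(x):
    f = sum(alpha[j] * y[j] * rbf_kernel(X[j], x) for j in range(len(X)))
    return 1 if f > 0 else -1
```

A point near the first cluster, e.g. `predict((0.1, 0.0))`, is labeled −1, and one near the second, e.g. `predict((1.0, 0.9))`, is labeled +1.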


References


M. Anthony and P. L. Bartlett, Neural Network Learning: Theoretical Foundations. Cambridge, U.K.: Cambridge University Press, 1999.

R. C. Williamson, A. J. Smola, and B. Schölkopf, "Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators," Royal Holloway College, London, U.K., NeuroCOLT Tech. Rep. NC-TR-98-019, 1998.