PAC-Bayes Risk Bounds for Stochastic Averages and Majority Votes of Sample-Compressed Classifiers

@article{Laviolette2007PACBayesRB,
  title={PAC-Bayes Risk Bounds for Stochastic Averages and Majority Votes of Sample-Compressed Classifiers},
  author={François Laviolette and Mario Marchand},
  journal={Journal of Machine Learning Research},
  year={2007},
  volume={8},
  pages={1461-1487}
}
We propose a PAC-Bayes theorem for the sample-compression setting where each classifier is described by a compression subset of the training data and a message string of additional information. This setting, which is the appropriate one to describe many learning algorithms, strictly generalizes the usual data-independent setting where classifiers are represented only by data-independent message strings (or parameters taken from a continuous set). The proposed PAC-Bayes theorem for the sample…
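
For context, a minimal statement of the usual data-independent PAC-Bayes theorem (in its Langford–Seeger form) that the abstract says is strictly generalized. This is standard background, not the paper's sample-compression result, and the notation (prior P, posterior Q, Gibbs classifier G_Q, sample size m, confidence δ) is assumed rather than quoted from the paper: with probability at least 1 − δ over the draw of S ∼ D^m, simultaneously for all posteriors Q,

\[
\mathrm{kl}\!\left( R_S(G_Q) \,\middle\|\, R_D(G_Q) \right)
\;\le\;
\frac{1}{m}\left[ \mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta} \right],
\]

where R_S(G_Q) and R_D(G_Q) are the empirical and true risks of the Gibbs classifier G_Q, KL(Q || P) is the Kullback–Leibler divergence between posterior and prior, and kl(q || p) = q ln(q/p) + (1 − q) ln((1 − q)/(1 − p)). In the sample-compression setting described above, the prior is instead placed over pairs consisting of a compression subset of the training data and a message string, which is what allows the bound to cover classifiers, such as support vector machines, that are reconstructed directly from training examples.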
