
Estimated VC dimension for risk bounds

  • D. J. McDonald, C. Shalizi, M. Schervish
  • Published 2011
  • Mathematics
  • arXiv: Machine Learning
  • Vapnik-Chervonenkis (VC) dimension is a fundamental measure of the generalization capacity of learning algorithms. However, apart from a few special cases, it is hard or impossible to calculate analytically. Vapnik et al. [10] proposed a technique for estimating the VC dimension empirically. While their approach behaves well in simulations, it could not be used to bound the generalization risk of classifiers, because there were no bounds for the estimation error of the VC dimension itself. We…
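The empirical technique of Vapnik et al. referenced in the abstract estimates the VC dimension by fitting a theoretical bounding function to observed maximal deviations between error rates on split samples at several design sample sizes. A minimal sketch of that fitting step, assuming the bounding function and its constants (a ≈ 0.16, b ≈ 1.2, k ≈ 0.14928) from Vapnik, Levin, and LeCun's 1994 paper; the names `phi` and `estimate_vc` and the grid-search fit are illustrative, not the authors' implementation:

```python
import numpy as np

def phi(tau):
    """Theoretical bound on the expected maximal deviation at
    effective sample size tau = n / h (constants as reported in
    Vapnik, Levin & LeCun, 1994)."""
    a, b, k = 0.16, 1.2, 0.14928  # assumed constants from the 1994 paper
    tau = np.asarray(tau, dtype=float)
    out = np.ones_like(tau)            # Phi(tau) = 1 for tau < 0.5
    big = tau >= 0.5
    t = tau[big]
    num = np.log(2 * t) + 1
    out[big] = a * num / (t - k) * (np.sqrt(1 + b * (t - k) / num) + 1)
    return out

def estimate_vc(sample_sizes, deviations, h_grid):
    """Pick the VC dimension h whose curve phi(n / h) best matches the
    observed deviations, by least squares over a candidate grid."""
    n = np.asarray(sample_sizes, dtype=float)
    errs = [np.sum((phi(n / h) - deviations) ** 2) for h in h_grid]
    return h_grid[int(np.argmin(errs))]
```

On noise-free synthetic deviations generated from `phi` with a known h, the grid search recovers h exactly; on real split-sample deviations the fit is only approximate, which is the estimation error the abstract says had no bounds.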
    6 Citations

    Deriving an optimality criterion for estimating VC dimension
    Model Selection via the VC Dimension
    Structural Return Maximization for Reinforcement Learning
    Time series forecasting: model evaluation and selection using nonparametric risk bounds
    Decision making in the presence of complex dynamics from limited, batch data


    References

    Measuring the VC-Dimension Using Optimized Experimental Design
    Rademacher and Gaussian Complexities: Risk Bounds and Structural Results
    Measuring the VC-Dimension of a Learning Machine
    Statistical learning theory
    Convergence of stochastic processes
    The Nature of Statistical Learning Theory
    • V. N. Vapnik
    • Statistics for Engineering and Information Science, 2000
    Empirical Processes in M-Estimation
    Estimating a regression function
    The necessary and sufficient conditions for consistency of the method of empirical risk minimization
    • Pattern Recognition and Image Analysis, 1(3), 284–305.