VC Dimension of Neural Networks

Eduardo D. Sontag
This paper presents a brief introduction to the Vapnik-Chervonenkis (VC) dimension, a quantity that characterizes the difficulty of distribution-independent learning. The paper establishes various elementary results and discusses how to estimate the VC dimension in several examples of interest in neural network theory.
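As a small illustration of the concept (not taken from the paper): the VC dimension of a hypothesis class is the size of the largest point set it can shatter, i.e. label in all possible ways. For finite hypothesis sets this can be checked by brute force. The sketch below uses one-sided threshold classifiers on the real line, whose VC dimension is 1; the function and threshold values are hypothetical choices for the example.

```python
def shatters(points, hypotheses):
    """True if the hypothesis set realizes every 0/1 labeling of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

# One-sided threshold classifiers on the real line: h_t(x) = 1 iff x >= t.
# Thresholds are chosen to fall between and outside the sample points.
thresholds = [lambda x, t=t: int(x >= t) for t in (-1.5, -0.5, 0.5, 1.5, 2.5)]

print(shatters([0.0], thresholds))       # True: a single point is shattered
print(shatters([0.0, 1.0], thresholds))  # False: the labeling (1, 0) needs
                                         # x1 >= t > x2 with x1 < x2, impossible
# Hence the VC dimension of one-sided thresholds is exactly 1.
```

The same brute-force check extends to, e.g., small linear classifiers in the plane, though enumerating hypotheses quickly becomes the bottleneck; the point here is only to make the shattering definition concrete.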
This paper has 128 citations.

Publications citing this paper (showing 5 of 52 extracted citations):

Learning Neural Network Classifiers with Low Model Complexity

ArXiv • 2017

Bounds on the number of hidden neurons in three-layer binary neural networks

Neural Networks • 2003

Local learning by partitioning


Binary Neural Network Classifier and its bound for the number of hidden layer neurons

2010 11th International Conference on Control Automation Robotics & Vision • 2010

PhysicsGP: A Genetic Programming approach to event selection

Computer Physics Communications • 2005
