Vapnik–Chervonenkis theory
Known as: VC theory, Vapnik Chervonenkis theory, Vapnik-Chervonenkis theory
Vapnik–Chervonenkis theory (also known as VC theory) was developed during 1960–1990 by Vladimir Vapnik and Alexey Chervonenkis. The theory is a form…
Source: Wikipedia
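The description above can be made concrete with a small illustrative sketch (not taken from this page): a central quantity in VC theory is the VC dimension, the largest number of points a hypothesis class can shatter, i.e. realize every possible binary labeling of. The brute-force shattering check below uses the standard textbook class of 1-D threshold classifiers; the threshold grid and test points are assumptions chosen for the example.

```python
from itertools import product

def shatters(hypotheses, points):
    """True if the hypothesis class realizes every binary labeling of `points`."""
    needed = set(product([0, 1], repeat=len(points)))
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return needed <= realized

# Hypothesis class: thresholds h_t(x) = 1 if x >= t else 0,
# on a grid fine enough to separate the integer test points below.
thresholds = [t / 2 for t in range(-2, 12)]
hypotheses = [lambda x, t=t: int(x >= t) for t in thresholds]

print(shatters(hypotheses, [3]))     # any single point is shattered -> True
print(shatters(hypotheses, [2, 5]))  # the labeling (1, 0) is unrealizable -> False
```

Since every single point is shattered but no pair is (a threshold can never label a smaller point 1 and a larger point 0), the VC dimension of this class is 1.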
Related topics (13 relations)
Alexey Chervonenkis, Empirical risk minimization, Hypograph (mathematics), KXEN, …
Broader (1): Computational learning theory
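One of the related topics, empirical risk minimization (ERM), selects the hypothesis with the fewest errors on the training sample; VC theory bounds how far that empirical risk can stray from the true risk. A minimal, illustrative ERM sketch for 1-D threshold classifiers h_t(x) = 1[x >= t] (the function name and sample data are assumptions for this example, not from the page):

```python
def erm_threshold(samples):
    """Return the threshold t minimizing empirical risk for h_t(x) = 1[x >= t].

    `samples` is a list of (x, label) pairs with labels in {0, 1}.
    It suffices to try the sample points themselves as candidate
    thresholds, plus +inf for the all-zeros classifier.
    """
    candidates = sorted({x for x, _ in samples}) + [float("inf")]

    def empirical_risk(t):
        # Number of training points the threshold classifier mislabels.
        return sum(int(x >= t) != y for x, y in samples)

    return min(candidates, key=empirical_risk)

best = erm_threshold([(0.0, 0), (1.0, 0), (2.0, 1), (3.0, 1)])
print(best)  # 2.0: separates the labels with zero training error
```

Restricting the search to sample points is sound because the empirical risk is constant between consecutive sample values, so some sample point (or +inf) always attains the minimum.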
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2011
Vapnik-Chervonenkis Density in Some Theories without the Independence Property, II
Matthias Aschenbrenner, Alfred Dolich, Deirdre Haskell, D. Macpherson, S. Starchenko
Notre Dame J. Formal Log., 2011 · Corpus ID: 14491756
We study the Vapnik-Chervonenkis (VC) density of definable families in certain stable first-order theories. In particular we obtain…
2008
Vapnik-Chervonenkis theory
R. Kondor
2008 · Corpus ID: 131767329
For the purposes of this lecture, we restrict ourselves to the binary supervised batch learning setting. We assume that we have…
2008
Theories controlled by formulas of Vapnik-Chervonenkis codimension 1
Hans Adler
2008 · Corpus ID: 124822565
The notion of a VC-minimal theory is introduced, a slightly more general variant of C-minimality that also includes all strongly…
2007
Ellipsoidal Kernel Machines
Pannagadatta K. Shivaswamy
2007 · Corpus ID: 16895625
A novel technique is proposed for improving the standard Vapnik-Chervonenkis (VC) dimension estimate for the Support Vector…
2002
Machine Learning with Data Dependent Hypothesis Classes
Adam Cannon, J. M. Ettinger, D. Hush, C. Scovel
Journal of Machine Learning Research, 2002 · Corpus ID: 13005286
We extend the VC theory of statistical learning to data dependent spaces of classifiers. This theory can be viewed as a…
Highly Cited · 2000
Learning with recurrent neural networks
B. Hammer
2000 · Corpus ID: 15529289
This thesis examines so-called folding neural networks as a mechanism for machine learning. Folding networks form a…
Highly Cited · 1999
Feedforward Neural Network Methodology
T. Fine
Information Science and Statistics, 1999 · Corpus ID: 28055334
From the Publisher: This monograph provides a thorough and coherent introduction to the mathematical properties of feedforward…
1996
Vapnik-Chervonenkis Theory
Luc Devroye, László Györfi, Gábor Lugosi
1996 · Corpus ID: 118198579
In this chapter we select a decision rule from a class of rules with the help of training data. Working formally, let C be a…
1994
The Power of Self-Directed Learning
S. Goldman, R. Sloan
Machine-mediated learning, 1994 · Corpus ID: 11781898
This article studies self-directed learning, a variant of the on-line (or incremental) learning model in which the…
Highly Cited · 1992
Vapnik-Chervonenkis classes of definable sets
M. Laskowski
1992 · Corpus ID: 15969167
We show that a class of subsets of a structure uniformly definable by a first-order formula is a Vapnik-Chervonenkis class if and…