- Published 2017 in COLT

The nonparametric setting in classification allows for a generality which has so far provided remarkable insights into how the interaction between distributional parameters controls learning rates. In particular, the interaction between feature X ∈ R and label Y ∈ {0, 1} can be parametrized into label-noise regimes that cleanly interpolate between hard and easy problems. This theory is now well developed for passive learning, i.e., under i.i.d. sampling; for active learning, where the learner actively chooses informative samples, the theory is still evolving. Our goals in this work are both statistical and algorithmic, the common thrust being to better understand how label-noise regimes control the active setting and induce performance gains over the passive setting. An initial nonparametric result of Castro and Nowak (2008) considers situations where the Bayes decision boundary {x : E[Y | X = x] = 1/2} is given by a smooth curve which bisects the X space. That work yields nontrivial early insights into nonparametric active learning by formalizing a situation where active rates are significantly faster than their passive counterparts. More recently, Minsker (2012a) considered a different nonparametric setting, also of interest here: rather than assuming a smooth boundary between the classes, the joint distribution of the data P
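To make the central object above concrete, here is a minimal Python sketch of the Bayes decision boundary {x : E[Y | X = x] = 1/2} and the Bayes classifier it induces. The regression function `eta` below is a hypothetical illustrative choice, not one from the paper; the point is only that the optimal rule thresholds E[Y | X = x] at 1/2, and its risk is E[min(η(X), 1 − η(X))].

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-d regression function eta(x) = E[Y | X = x];
# an illustrative choice (not from the paper), crossing 1/2 at x = 0.5,
# so the Bayes decision boundary is the single point {0.5}.
def eta(x):
    return 1.0 / (1.0 + np.exp(-8.0 * (x - 0.5)))

# Bayes classifier: predict 1 exactly where eta(x) >= 1/2.
def bayes_classifier(x):
    return (eta(x) >= 0.5).astype(int)

# Simulate (X, Y): X uniform on [0, 1], Y ~ Bernoulli(eta(X)).
n = 100_000
X = rng.random(n)
Y = (rng.random(n) < eta(X)).astype(int)

# The empirical error of the Bayes rule approximates the Bayes risk
# E[min(eta(X), 1 - eta(X))], since the rule errs at x with
# probability min(eta(x), 1 - eta(x)).
err = np.mean(bayes_classifier(X) != Y)
bayes_risk = np.mean(np.minimum(eta(X), 1.0 - eta(X)))
```

How quickly η crosses the 1/2 level near the boundary is what the label-noise (margin) regimes in the abstract quantify: a steep crossing is an easy problem, a flat one is hard.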


@inproceedings{Andrea2017AdaptivityTN,
  title={Adaptivity to Noise Parameters in Nonparametric Active Learning},
  author={Andrea Locatelli and Alexandra Carpentier and Samory Kpotufe},
  booktitle={COLT},
  year={2017}
}