
- Steve Hanneke
- ICML
- 2007

We study the label complexity of pool-based active learning in the agnostic PAC model. Specifically, we derive general bounds on the number of label requests made by the *A*² algorithm proposed by Balcan, Beygelzimer & Langford (Balcan et al., 2006). This represents the first nontrivial general-purpose upper bound on label complexity in…
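The querying idea behind *A*² can be illustrated with a minimal hand-rolled sketch (my own illustration, not code from the paper, and closer to a noise-free disagreement-based learner than to the agnostic *A*² itself): for one-dimensional threshold classifiers, the learner requests a label only where the hypotheses still consistent with past answers disagree.

```python
import random

def run_disagreement_based(pool, oracle):
    """Learn a threshold h_t(x) = [x >= t] on [0, 1], querying labels only
    inside the region of disagreement of the current version space."""
    lo, hi = 0.0, 1.0   # thresholds t in (lo, hi] are still consistent
    queries = 0
    for x in pool:
        if lo < x < hi:              # consistent hypotheses disagree on x
            queries += 1
            if oracle(x) == 1:
                hi = min(hi, x)      # threshold must satisfy t <= x
            else:
                lo = max(lo, x)      # threshold must satisfy t > x
        # points outside (lo, hi) are labeled unanimously: no query needed
    return (lo + hi) / 2, queries

random.seed(0)
pool = [random.random() for _ in range(1000)]
true_t = 0.37
t_hat, queries = run_disagreement_based(pool, lambda x: int(x >= true_t))
print(f"estimate={t_hat:.3f} from {queries} label queries on {len(pool)} points")
```

With a randomly ordered pool, each query tends to shrink the disagreement region, so the number of label requests typically grows only logarithmically in the pool size, whereas a passive learner would need labels for the whole pool to pin the threshold down to the same resolution.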

- Steve Hanneke, Eric P. Xing
- SNA@ICML
- 2006

We propose a family of statistical models for social network evolution over time, which represents an extension of Exponential Random Graph Models (ERGMs). Many of the methods for ERGMs are readily adapted for these models, including MCMC maximum likelihood estimation algorithms. We discuss models of this type and give examples, as well as a demonstration…
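For context, a static ERGM puts probability proportional to exp(θ·s(G)) on a graph G with sufficient statistics s(G), and MCMC methods rely on samplers like the edge-toggle Metropolis sketch below. This is a minimal hand-rolled illustration with edge-count and triangle-count statistics; all names are my own, not from the paper.

```python
import itertools
import math
import random

def graph_stats(adj, n):
    """Sufficient statistics: number of edges and number of triangles."""
    edges = sum(adj[i][j] for i in range(n) for j in range(i + 1, n))
    triangles = sum(1 for i, j, k in itertools.combinations(range(n), 3)
                    if adj[i][j] and adj[j][k] and adj[i][k])
    return edges, triangles

def ergm_metropolis(n, theta_edge, theta_tri, steps, seed=1):
    """Sample from P(G) proportional to exp(theta_edge*edges + theta_tri*triangles)
    by proposing single-edge toggles and accepting with the Metropolis rule."""
    rng = random.Random(seed)
    adj = [[0] * n for _ in range(n)]
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        d_edge = -1 if adj[i][j] else 1          # toggling adds or removes one edge
        shared = sum(adj[i][k] and adj[j][k]
                     for k in range(n) if k != i and k != j)
        d_tri = d_edge * shared                  # triangles closed/opened by the toggle
        log_accept = theta_edge * d_edge + theta_tri * d_tri
        if log_accept >= 0 or rng.random() < math.exp(log_accept):
            adj[i][j] = adj[j][i] = 1 - adj[i][j]
    return adj

n = 12
adj = ergm_metropolis(n, theta_edge=-1.0, theta_tri=0.3, steps=4000)
edges, triangles = graph_stats(adj, n)
```

The temporal models in the paper chain distributions of this exponential-family form across time steps, conditioning each network on its predecessor rather than sampling a single static graph.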

- Steve Hanneke
- COLT
- 2009

We study the rates of convergence in classification error achievable by active learning in the presence of label noise. Additionally, we study the more general problem of active learning with a nested hierarchy of hypothesis classes, and propose an algorithm whose error rate provably converges to the best achievable error among classifiers in the hierarchy…

- Maria-Florina Balcan, Steve Hanneke, Jennifer Wortman Vaughan
- Machine Learning
- 2008

We describe and explore a new perspective on the sample complexity of active learning. In many situations where it was generally believed that active learning does not help, we show that active learning does help in the limit, often with exponential improvements in sample complexity. This contrasts with the traditional analysis of active learning problems…

- Fan Guo, Steve Hanneke, Wenjie Fu, Eric P. Xing
- ICML
- 2007

A plausible representation of relational information among entities in dynamic systems such as a living cell or a social community is a stochastic network which is topologically rewiring and semantically evolving over time. While there is a rich literature on modeling static or temporally invariant networks, much less has been done toward modeling the…

- Steve Hanneke
- COLT
- 2007

We study the label complexity of pool-based active learning in the PAC model with noise. Taking inspiration from extant literature on Exact learning with membership queries, we derive upper and lower bounds on the label complexity in terms of generalizations of extended teaching dimension. Among the contributions of this work is the first nontrivial general…

- Steve Hanneke, Avrim Blum, …, Eric P. Xing
- 2009

I study the informational complexity of active learning in a statistical learning theory framework. Specifically, I derive bounds on the rates of convergence achievable by active learning, under various noise models and under general conditions on the hypothesis class. I also study the theoretical advantages of active learning over passive learning, and…

- Steve Hanneke
- Foundations and Trends in Machine Learning
- 2014

Active learning is a protocol for supervised machine learning, in which a learning algorithm sequentially requests the labels of selected data points from a large pool of unlabeled data. This contrasts with passive learning, where the labeled data are taken at random. The objective in active learning is to produce a highly accurate classifier, ideally using…
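The pool-based protocol described here can be made concrete with a small sketch (everything below, including the nearest-centroid model and the uncertainty-sampling query rule, is my own illustration rather than anything from the monograph): an oracle that counts label requests, and a learner that spends its label budget on the pool points its current model is least certain about.

```python
import random

class LabelOracle:
    """Ground-truth labeler that counts how many labels the learner requests."""
    def __init__(self, label_fn):
        self.label_fn = label_fn
        self.calls = 0

    def query(self, x):
        self.calls += 1
        return self.label_fn(x)

def uncertainty_sampling(pool, oracle, budget, seed=0):
    """Pool-based active learning with a 1-D nearest-centroid model:
    repeatedly query the unlabeled point closest to the decision boundary."""
    rng = random.Random(seed)
    unlabeled = list(pool)
    rng.shuffle(unlabeled)
    labeled = []
    while len(labeled) < budget and unlabeled:
        zeros = [x for x, y in labeled if y == 0]
        ones = [x for x, y in labeled if y == 1]
        if zeros and ones:
            boundary = (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2
            x = min(unlabeled, key=lambda p: abs(p - boundary))
        else:
            x = unlabeled[-1]        # explore until both classes have been seen
        unlabeled.remove(x)
        labeled.append((x, oracle.query(x)))
    zeros = [x for x, y in labeled if y == 0]
    ones = [x for x, y in labeled if y == 1]
    return (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2

random.seed(0)
pool = [random.random() for _ in range(500)]
oracle = LabelOracle(lambda x: int(x >= 0.3))
boundary = uncertainty_sampling(pool, oracle, budget=25)
```

Counting calls on the oracle makes the passive/active comparison explicit: a passive learner draws its labeled sample uniformly, whereas the active learner concentrates the same budget of label requests near its current decision boundary.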

- Steve Hanneke
- ICML
- 2006

I consider the setting of transductive learning of vertex labels in graphs, in which a graph with *n* vertices is sampled according to some unknown distribution; there is a true labeling of the vertices such that each vertex is assigned to exactly one of *k* classes, but the labels of only some (random) subset of the vertices are revealed to the…

- Steve Hanneke
- Journal of Machine Learning Research
- 2012

We study the theoretical advantages of active learning over passive learning. Specifically, we prove that, in noise-free classifier learning for VC classes, any passive learning algorithm can be transformed into an active learning algorithm with asymptotically strictly superior label complexity for all nontrivial target functions and distributions. We…