# Predicting with Distributions

Michael Kearns and Zhiwei Steven Wu. "Predicting with Distributions." arXiv:1606.01275, 2016.

We consider a new learning model in which a joint distribution over vector pairs $(x,y)$ is determined by an unknown function $c(x)$ that maps input vectors $x$ not to individual outputs, but to entire *distributions* over output vectors $y$. Our main results take the form of rather general reductions from our model to algorithms for PAC learning the function class and the distribution class separately, and show that virtually every such combination yields an efficient algorithm in our…
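As a concrete (and entirely hypothetical) instance of this model, the sketch below draws pairs $(x,y)$ in which an unknown linear-threshold function $c(x)$ selects which of two Bernoulli product distributions generates the output vector $y$; the function class, distribution class, and all parameters are illustrative choices, not the paper's constructions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown function c: a linear threshold rule mapping each
# input vector x to an index, which in turn names an entire distribution
# over output vectors y (rather than a single output).
def c(x):
    return 0 if x @ np.array([1.0, -1.0, 0.5]) > 0 else 1

# Two candidate output distributions: each is a product of Bernoullis
# with the given per-coordinate biases.
BIASES = [np.array([0.9, 0.1, 0.5]), np.array([0.2, 0.8, 0.3])]

def sample_pair():
    x = rng.standard_normal(3)
    y = (rng.random(3) < BIASES[c(x)]).astype(int)  # y ~ D_{c(x)}
    return x, y

pairs = [sample_pair() for _ in range(5)]
```

A learner in this model sees only such $(x,y)$ pairs and must recover both the selector function and the output distributions, which is why the reductions combine a PAC learner for the function class with a learner for the distribution class.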

## 4 Citations

### Improving data quality to build a robust distribution model for Architeuthis dux

- Environmental Science
- 2015

### Using consensus mapping methods as an efficient way of depicting avian distributions in the Caatinga Dry Forest, a poorly known Neotropical biome

- Environmental Science, Ornithology Research
- 2022

Mapping species distributions has become central for biodiversity research. Different mapping methods, however, may result in dramatically different spatial patterns. We used expert-drawn maps…

### Patterns of livestock depredation by snow leopards and effects of intervention strategies: lessons from the Nepalese Himalaya

- Environmental Science, Wildlife Research
- 2022

ABSTRACT Context. Large carnivores are increasingly threatened by anthropogenic activities, and their protection is among the main goals of biodiversity conservation. The snow leopard (Panthera…

### Cameroon’s adaptation to climate change and sorghum productivity

- Economics, Cogent Social Sciences
- 2022

Abstract Cameroon’s semi-arid zone (northern and far northern regions) is an important part of the local ecosystem that is vulnerable to climate change. This vulnerability raises concerns about rural…

## References

Showing 1-10 of 25 references.

### Decision Theoretic Generalizations of the PAC Model for Neural Net and Other Learning Applications

- Mathematics, Computer Science, Inf. Comput.
- 1992

### On the learnability of discrete distributions

- Computer Science, STOC '94
- 1994

A new model of learning probability distributions from independent draws is introduced, inspired by the Probably Approximately Correct (PAC) model for learning Boolean functions from labeled examples: like the PAC model, it emphasizes efficient and approximate learning, and it studies the learnability of restricted classes of target distributions.

### Efficient noise-tolerant learning from statistical queries

- Computer Science, STOC
- 1993

This paper formalizes a new but related model of learning from statistical queries and demonstrates its generality, showing that practically every class learnable in Valiant's model and its variants can also be learned in the new model (and thus can be learned in the presence of noise).
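To make the statistical query idea concrete, here is a minimal simulation (with an illustrative target concept and query, not taken from the paper): instead of receiving individual labeled examples, the learner asks for the expectation of a predicate over labeled examples, answered to within a tolerance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated statistical query oracle: answers E[chi(x, label)] to within
# tolerance tau by averaging over enough random examples.  The target
# concept and the query below are hypothetical illustrations.
def sq_oracle(chi, tau, target, n_dims=5):
    n = 10 * int(np.ceil(1.0 / tau**2))   # sample size scaled to the tolerance
    x = rng.integers(0, 2, size=(n, n_dims))
    y = target(x)
    return float(np.mean(chi(x, y)))

target = lambda x: x[:, 0]                # target concept: the first bit

# Query the agreement of coordinate 1 with the label; coordinate 1 is
# independent of the label here, so the true expectation is 0.5.
est = sq_oracle(lambda x, y: (x[:, 1] == y).astype(float), tau=0.05, target=target)
```

Because each query returns an expectation rather than an individual example, independent classification noise merely perturbs the answers by a bounded amount, which is the source of the model's noise tolerance.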

### Learning From Noisy Examples

- Computer Science, Machine Learning
- 2005

This paper shows that when the teacher may make independent random errors in classifying the example data, the strategy of selecting the most consistent rule for the sample is sufficient, and usually requires a feasibly small number of examples, provided noise affects less than half the examples on average.
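A small simulation of that strategy, using an illustrative class of one-dimensional threshold rules: labels are flipped independently at rate η = 0.2 < 1/2, and the hypothesis disagreeing with the fewest noisy labels is selected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative hypothesis class: threshold rules x >= t over [0, 1].
thresholds = np.linspace(0.0, 1.0, 11)
hypotheses = [lambda x, t=t: x >= t for t in thresholds]
target = hypotheses[5]                    # true rule: x >= 0.5
eta = 0.2                                 # classification noise rate

x = rng.random(2000)
flips = rng.random(2000) < eta
labels = target(x) ^ flips                # each label flipped w.p. eta

# Select the rule most consistent with the noisy sample.
errors = [np.sum(h(x) != labels) for h in hypotheses]
best_t = thresholds[int(np.argmin(errors))]
```

Even though every hypothesis, including the target, disagrees with roughly an η fraction of the noisy labels, the target's disagreement rate is strictly lowest in expectation, so minimizing disagreements still identifies it given enough examples.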

### Learning mixtures of arbitrary gaussians

- Computer Science, STOC '01
- 2001

This paper presents the first algorithm that provably learns the component Gaussians in time polynomial in the dimension.

### Learning mixtures of product distributions over discrete domains

- Computer Science, Mathematics, 46th Annual IEEE Symposium on Foundations of Computer Science (FOCS'05)
- 2005

This work gives a poly(n/ε)-time algorithm for learning a mixture of k arbitrary product distributions over the n-dimensional Boolean cube to accuracy ε for any constant k, and proves that no polynomial-time algorithm can succeed when k is superconstant.

### PAC Learning with Constant-Partition Classification Noise and Applications to Decision Tree Induction

- Computer Science, ICML
- 1997

A new model of noise called constant-partition classification noise (CPCN) is introduced which generalizes the standard model of classification noise to allow different examples to have different rates of random misclassification.

### Combining labeled and unlabeled data with co-training

- Computer Science, COLT '98
- 1998

A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, allowing inexpensive unlabeled data to augment a much smaller set of labeled examples.
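The two-view idea can be sketched as follows — a toy simulation with hypothetical one-dimensional views and threshold learners, not the paper's web-page setting: a rule trained on the first view of a handful of labeled examples pseudo-labels the unlabeled pool, which then trains a rule on the second view.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_threshold(v, y):
    # choose the threshold t minimizing training error of the rule v >= t
    ts = np.unique(v)
    errs = [np.mean((v >= t) != y) for t in ts]
    return ts[int(np.argmin(errs))]

# Each example has two redundant views, each individually predictive.
n = 200
label = rng.integers(0, 2, n).astype(bool)
view1 = label + rng.normal(0, 0.3, n)
view2 = label + rng.normal(0, 0.3, n)

labeled = np.arange(20)                   # only 20 labeled examples
t1 = fit_threshold(view1[labeled], label[labeled])

# The view-1 rule pseudo-labels everything for the view-2 learner.
pseudo = view1 >= t1
t2 = fit_threshold(view2, pseudo)
accuracy = np.mean((view2 >= t2) == label)
```

The second rule is trained purely on pseudo-labels, yet can approach the accuracy of the first because the two views are redundant given the label — the structural assumption that makes unlabeled data useful here.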

### PAC Learning Axis-Aligned Mixtures of Gaussians with No Separation Assumption

- Computer Science, COLT
- 2006

A new vantage point for the learning of mixtures of Gaussians is proposed: namely, the PAC-style model of learning probability distributions introduced by Kearns et al.

### The Strength of Weak Learnability

- Computer Science, Machine Learning
- 2004

In this paper, a method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy, showing that the two notions of weak and strong learnability are equivalent.
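As an illustration of the weak-to-strong conversion, here is a compact sketch of AdaBoost (a later boosting algorithm in the same lineage, not this paper's construction), using exhaustively-selected decision stumps as the weak learners on a toy one-dimensional dataset:

```python
import numpy as np

def adaboost(x, y, rounds=3):
    """Boost decision stumps: reweight examples each round and combine
    the weak rules by a weighted majority vote."""
    n = len(x)
    w = np.full(n, 1.0 / n)
    ensemble = []                          # (alpha, threshold, sign) triples
    for _ in range(rounds):
        best = None
        for t in np.unique(x):             # weak learner: best stump x >= t
            for s in (1, -1):
                pred = np.where(x >= t, s, -s)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = np.where(x >= t, s, -s)
        w = w * np.exp(-alpha * y * pred)  # upweight the mistakes
        w = w / w.sum()
        ensemble.append((alpha, t, s))
    def strong(xq):
        votes = sum(a * np.where(xq >= t, s, -s) for a, t, s in ensemble)
        return np.sign(votes)
    return strong

# A labeling no single stump can fit exactly; a few rounds of boosting
# drive the training error of the combined vote to zero.
x = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
y = np.array([-1, 1, 1, 1, -1])
H = adaboost(x, y)
```

Each round the reweighting forces the next stump to concentrate on the points the current vote gets wrong, which is the mechanism by which a weak advantage over random guessing is amplified.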