# Closure Properties for Private Classification and Online Prediction

    @inproceedings{Alon2020ClosurePF,
      title     = {Closure Properties for Private Classification and Online Prediction},
      author    = {Noga Alon and Amos Beimel and Shay Moran and Uri Stemmer},
      booktitle = {COLT},
      year      = {2020}
    }

Let $\mathcal{H}$ be a class of Boolean functions and consider a *composed class* $\mathcal{H}'$ that is derived from $\mathcal{H}$ using some arbitrary aggregation rule (for example, $\mathcal{H}'$ may be the class of all 3-wise majority-votes of functions in $\mathcal{H}$). We upper bound the Littlestone dimension of $\mathcal{H}'$ in terms of that of $\mathcal{H}$. As a corollary, we derive closure properties for online learning and private PAC learning.
The derived bounds on the Littlestone dimension exhibit an undesirable exponential…
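The composition in the abstract can be made concrete with a small brute-force computation (a sketch for intuition, not code from the paper). For a finite class over a finite domain, the Littlestone dimension satisfies the standard recursion $\mathrm{Ldim}(\mathcal{H}) = \max_x \big(1 + \min(\mathrm{Ldim}(\mathcal{H}_{x,0}), \mathrm{Ldim}(\mathcal{H}_{x,1}))\big)$ over points $x$ that split the class. Below, hypotheses are 0/1 tuples over a 4-point domain, and the composed class is all 3-wise majority votes:

```python
from itertools import combinations_with_replacement

def ldim(H):
    """Brute-force Littlestone dimension of a finite class H,
    each hypothesis given as a tuple of 0/1 labels over a finite domain."""
    H = list(set(H))
    if not H:
        return -1  # convention for the empty class
    n = len(H[0])
    best = 0
    for x in range(n):
        H0 = [h for h in H if h[x] == 0]
        H1 = [h for h in H if h[x] == 1]
        if H0 and H1:  # point x splits the class: a mistake tree can branch here
            best = max(best, 1 + min(ldim(H0), ldim(H1)))
    return best

def maj3(f, g, h):
    """Pointwise 3-wise majority vote of three hypotheses."""
    return tuple(int(a + b + c >= 2) for a, b, c in zip(f, g, h))

# H: the point functions (singletons) over a 4-point domain
H = [tuple(int(i == j) for i in range(4)) for j in range(4)]
# H': all 3-wise majority votes of functions in H (here: H plus the all-zero function)
H_maj = {maj3(f, g, h) for f, g, h in combinations_with_replacement(H, 3)}

print(ldim(H), ldim(list(H_maj)))  # → 1 1
```

In this tiny example the dimension happens not to grow; the point of the paper is that for general aggregation rules the growth can be exponential in the arity, and this brute-force recursion is only feasible for very small classes.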


## 11 Citations

### Near-tight closure bounds for Littlestone and threshold dimensions

- Computer Science, Mathematics · ArXiv
- 2020

The upper bounds give an exponential (in $k$) improvement upon analogous bounds shown by Alon et al. (COLT 2020), thus answering a question posed by their work.

### Private and Online Learnability Are Equivalent

- Computer Science · Journal of the ACM
- 2022

This work proves that a class $\mathcal{H}$ can be PAC learned by an (approximate) differentially private algorithm if and only if it has finite Littlestone dimension, implying a qualitative equivalence between online learnability and private PAC learnability.

### Learning Privately with Labeled and Unlabeled Examples

- Computer Science · Algorithmica
- 2020

An alternative approach is suggested, inspired by the (non-private) models of semi-supervised learning and active-learning, where the focus is on the sample complexity of labeled examples whereas unlabeled examples are of a significantly lower cost.

### Realizable Learning is All You Need

- Computer Science · COLT
- 2022

This work gives the first model-independent framework explaining the equivalence of realizable and agnostic learnability: a three-line blackbox reduction that simplifies, unifies, and extends the authors' understanding across a wide variety of settings.

### Agnostic Online Learning and Excellent Sets

- Mathematics · ArXiv
- 2021

A key idea from the interaction of model theory and combinatorics is revisited: the existence of large "indivisible" sets, called "ε-excellent," in k-edge-stable graphs (equivalently, Littlestone classes). A quite different existence proof is found, using regret bounds from online learning.

### On the Sample Complexity of Privately Learning Axis-Aligned Rectangles

- Computer Science · NeurIPS
- 2021

This work revisits the fundamental problem of learning axis-aligned rectangles over a grid $X^d \subseteq \mathbb{R}^d$ with differential privacy, and presents a novel algorithm that reduces the sample complexity to only $\tilde{O}\big(d \cdot (\log^* |X|)^{1.5}\big)$.

### Private learning implies quantum stability

- Computer Science · NeurIPS
- 2021

Results are summarized for learning real-valued concept classes and quantum states with imprecise feedback, together with a technique used to prove that the analogous implication for Boolean functions does not hold in the quantum learning setting.

### Online Learning with Simple Predictors and a Combinatorial Characterization of Minimax in 0/1 Games

- Computer Science · COLT
- 2021

This work provides nearly tight bounds on the optimal mistake bound for online learning a concept class C using predictors from a hypothesis class H, for any such C and H, and shows that the Minimax Theorem holds whenever the payoff matrix does not contain triangular submatrices of unbounded size.

### Sample-efficient proper PAC learning with approximate differential privacy

- Mathematics, Computer Science · STOC
- 2021

It is proved that the sample complexity of properly learning a class of Littlestone dimension $d$ with approximate differential privacy is $\tilde{O}(d^6)$, ignoring privacy and accuracy parameters, which implies that a class is sanitizable if and only if it has finite Littlestone dimension.

### On the Equivalence between Online and Private Learnability beyond Binary Classification

- Computer Science · NeurIPS
- 2020

This work shows that while online learnability continues to imply private learnability in multi-class classification, current proof techniques encounter significant hurdles in the regression setting, and provides non-trivial sufficient conditions for an online learnable class to also be privately learnable.

## References

Showing 1–10 of 44 references

### Agnostic Online Learning

- Computer Science · COLT
- 2009

This work describes several models of non-realizable data, derives upper and lower bounds on the achievable regret, and extends Littlestone's theory to margin-based hypothesis classes, in which the prediction of each hypothesis is accompanied by a confidence value.

### A theory of the learnable

- Computer Science · STOC '84
- 1984

This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.

### What Can We Learn Privately?

- Computer Science · 49th Annual IEEE Symposium on Foundations of Computer Science (FOCS)
- 2008

This work investigates learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in the contexts where aggregate information is released about a database containing sensitive information about individuals.

### Bounds on the sample complexity for private learning and private data release

- Computer Science · Machine Learning
- 2013

This work examines several private learning tasks and gives tight bounds on their sample complexity, and shows strong separations between sample complexities of proper and improper private learners (such separation does not exist for non-private learners), and between sample complexity of efficient and inefficient proper private learners.

### Learning Privately with Labeled and Unlabeled Examples

- Computer Science · Algorithmica
- 2020

An alternative approach is suggested, inspired by the (non-private) models of semi-supervised learning and active-learning, where the focus is on the sample complexity of labeled examples whereas unlabeled examples are of a significantly lower cost.

### A Shorter Model Theory

- Mathematics
- 1997

Chapter listing (excerpt): The first-order case: compactness · The countable case · The existential case · Structure and categoricity.

### Generalization for Adaptively-chosen Estimators via Stable Median

- Computer Science · COLT
- 2017

An algorithm is given that estimates the expectations of $k$ arbitrary adaptively-chosen real-valued estimators using a number of samples that scales as $\sqrt{k}$; it is essentially as accurate as if fresh samples were used to evaluate each estimator.

### The reusable holdout: Preserving validity in adaptive data analysis

- Computer Science · Science
- 2015

A new approach for addressing the challenges of adaptivity based on insights from privacy-preserving data analysis is demonstrated, and how to safely reuse a holdout data set many times to validate the results of adaptively chosen analyses is shown.