# Multiclass versus Binary Differentially Private PAC Learning

```bibtex
@inproceedings{Bun2021MulticlassVB,
  title     = {Multiclass versus Binary Differentially Private PAC Learning},
  author    = {Mark Bun and Marco Gaboardi and Satchit Sivakumar},
  booktitle = {NeurIPS},
  year      = {2021}
}
```

We show a generic reduction from multiclass differentially private PAC learning to binary private PAC learning. We apply this transformation to a recently proposed binary private PAC learner to obtain a private multiclass learner with sample complexity that has a polynomial dependence on the multiclass Littlestone dimension and a poly-logarithmic dependence on the number of classes. This yields a doubly exponential improvement in the dependence on both parameters over learners from previous…

## One Citation

### Private and Online Learnability Are Equivalent

- Computer Science, Journal of the ACM
- 2022

This work proves that H can be PAC learned by an (approximate) differentially private algorithm if and only if it has a finite Littlestone dimension, implying a qualitative equivalence between online learnability and private PAC learnability.

## References


### Private PAC learning implies finite Littlestone dimension

- Computer Science, STOC
- 2019

We show that every approximately differentially private learning algorithm (possibly improper) for a class H with Littlestone dimension d requires Ω(log* d) examples. As a corollary it follows that…

### Efficient, Noise-Tolerant, and Private Learning via Boosting

- Computer Science, COLT
- 2020

A simple framework for designing private boosting algorithms is introduced and is used to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension.

### An Equivalence Between Private Classification and Online Prediction

- Computer Science, Mathematics, 2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS)
- 2020

It is proved that every concept class with finite Littlestone dimension can be learned by an (approximate) differentially-private algorithm and a new notion of algorithmic stability called “global stability” is introduced which is essential to the proof and may be of independent interest.

### Characterizing the Sample Complexity of Pure Private Learners

- Computer Science, J. Mach. Learn. Res.
- 2019

A combinatorial characterization is given of the sample size necessary and sufficient to learn a class of concepts under pure differential privacy. A similar characterization holds for the database size needed to compute a large class of optimization problems under pure differential privacy, and for the well-studied problem of private data release.

### On the Equivalence between Online and Private Learnability beyond Binary Classification

- Computer Science, NeurIPS
- 2020

This work shows that while online learnability continues to imply private learnability in multi-class classification, current proof techniques encounter significant hurdles in the regression setting, and provides non-trivial sufficient conditions for an online learnable class to also be privately learnable.

### What Can We Learn Privately?

- Computer Science, 2008 49th Annual IEEE Symposium on Foundations of Computer Science
- 2008

This work investigates learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in the contexts where aggregate information is released about a database containing sensitive information about individuals.

### Privately Learning Thresholds: Closing the Exponential Gap

- Computer Science, Mathematics, COLT
- 2020

An improved algorithm is constructed for the related interior point problem, based on selecting an input-dependent hash function and using it to embed the database into a domain whose size is reduced logarithmically. The resulting database can then be used to generate an interior point of the original database in a differentially private manner.

### Sample Complexity Bounds on Differentially Private Learning via Communication Complexity

- Computer Science, SIAM J. Comput.
- 2015

It is shown that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint, or of learning with approximate differential privacy.

### Sample-efficient proper PAC learning with approximate differential privacy

- Mathematics, Computer Science, STOC
- 2021

It is proved that the sample complexity of properly learning a class of Littlestone dimension d with approximate differential privacy is Õ(d⁶), ignoring privacy and accuracy parameters. This implies that a class is sanitizable if and only if it has finite Littlestone dimension.

### Multiclass Learnability and the ERM principle

- Computer Science, COLT
- 2011

A principle is proposed for designing good ERM learners, and this principle is used to prove tight bounds on the sample complexity of learning symmetric multiclass hypothesis classes, i.e., classes that are invariant under permutations of label names.