Optimal Differentially Private Learning of Thresholds and Quasi-Concave Optimization
@inproceedings{Cohen2022OptimalDP,
  title     = {Optimal Differentially Private Learning of Thresholds and Quasi-Concave Optimization},
  author    = {Edith Cohen and Xin Lyu and Jelani Nelson and Tam{\'a}s Sarl{\'o}s and Uri Stemmer},
  booktitle = {Proceedings of the 55th Annual ACM Symposium on Theory of Computing},
  year      = {2022}
}
The problem of learning threshold functions is a fundamental one in machine learning. Classical learning theory implies a sample complexity of O(ξ^{-1} log(1/β)) (for generalization error ξ with confidence 1−β). The private version of the problem, however, is more challenging; in particular, the sample complexity must depend on the size |X| of the domain. Progress on quantifying this dependence, via lower and upper bounds, was made in a line of works over the past decade. In this paper, we…
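As background for the non-private baseline mentioned above, here is a minimal sketch of a classical (non-private) ERM learner for one-dimensional thresholds. The labeling convention f_t(x) = 1 iff x ≤ t, the helper name, and the sampling setup are illustrative assumptions, not anything from the paper; the point is only that a consistent threshold can be read off directly from the data, with no dependence on the domain size |X|.

```python
import random

def erm_threshold(samples):
    """Non-private ERM for 1-D thresholds (illustrative convention:
    label 1 iff x <= t). Returns any threshold consistent with the
    data: the midpoint between the largest positive example and the
    smallest negative example."""
    pos = [x for x, y in samples if y == 1]
    neg = [x for x, y in samples if y == 0]
    lo = max(pos, default=0.0)   # rightmost point labeled 1
    hi = min(neg, default=1.0)   # leftmost point labeled 0
    return (lo + hi) / 2.0

# Toy experiment with a hypothetical true threshold t_true.
random.seed(0)
t_true = 0.7
data = [(x, int(x <= t_true)) for x in (random.random() for _ in range(200))]
t_hat = erm_threshold(data)
# A consistent ERM hypothesis has zero error on the training sample.
err = sum(int((x <= t_hat) != (y == 1)) for x, y in data) / len(data)
```

A differentially private learner cannot simply release this midpoint, since it is determined by two individual data points; this is where the dependence on |X| studied in the paper enters.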
4 Citations
Private Everlasting Prediction
- Computer Science
- 2023
This work introduces private everlasting prediction, taking into account the privacy of both the training set and the (adaptively chosen) queries made to the predictor, and presents a generic construction of private everlasting predictors in the PAC model.
Differentially Private Medians and Interior Points for Non-Pathological Data
- Mathematics, ArXiv
- 2023
We construct differentially private estimators with low sample complexity that estimate the median of an arbitrary distribution over $\mathbb{R}$ satisfying very mild moment conditions. Our result…
The Target-Charging Technique for Privacy Accounting across Interactive Computations
- Computer Science, ArXiv
- 2023
TCT generalizes tools such as the sparse vector technique and top-$k$ selection from private candidates and extends their remarkable privacy enhancement benefits from noisy Lipschitz functions to general private algorithms.
Relaxed Models for Adversarial Streaming: The Advice Model and the Bounded Interruptions Model
- Computer Science, ArXiv
- 2023
This work sets out to explore intermediate models that allow interpolation between the oblivious and the adversarial models, and puts forward two such models: the advice model and the bounded interruptions model.
30 References
Privately Learning Thresholds: Closing the Exponential Gap
- Computer Science, Mathematics, COLT
- 2020
An improved algorithm for the related interior-point problem, based on selecting an input-dependent hash function and using it to embed the database into a domain whose size is reduced logarithmically; the resulting database can then be used to generate an interior point of the original database in a differentially private manner.
Differentially Private Release and Learning of Threshold Functions
- Computer Science, Mathematics, 2015 IEEE 56th Annual Symposium on Foundations of Computer Science
- 2015
The first nontrivial lower bound for releasing thresholds with (ε, δ) differential privacy is given, showing that the task is impossible over an infinite domain X, and moreover requires sample complexity n ≥ Ω(log* |X|), which grows with the size of the domain.
Bounds on the sample complexity for private learning and private data release
- Computer Science, Machine Learning
- 2013
This work examines several private learning tasks and gives tight bounds on their sample complexity, showing strong separations between the sample complexities of proper and improper private learners (no such separation exists for non-private learners), and between the sample complexities of efficient and inefficient proper private learners.
Sample Complexity Bounds for Differentially Private Learning
- Computer Science, COLT
- 2011
An upper bound on the sample requirement of learning with label privacy is provided that depends on a measure of closeness to the unlabeled data distribution, and applies to the non-realizable as well as the realizable case.
On the Sample Complexity of Privately Learning Axis-Aligned Rectangles
- Computer Science, NeurIPS
- 2021
This work revisits the fundamental problem of learning axis-aligned rectangles over a finite grid X^d ⊆ R^d with differential privacy, and presents a novel algorithm that reduces the sample complexity to only Õ(d · (log* |X|)^{1.5}).
Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
- Computer Science, SIAM J. Comput.
- 2015
It is shown that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint, or of learning with approximate differential privacy.
Private PAC learning implies finite Littlestone dimension
- Computer Science, STOC
- 2019
We show that every approximately differentially private learning algorithm (possibly improper) for a class H with Littlestone dimension d requires Ω(log*(d)) examples. As a corollary it follows that…
Learning Privately with Labeled and Unlabeled Examples
- Computer Science, Algorithmica
- 2020
An alternative approach is suggested, inspired by the (non-private) models of semi-supervised learning and active learning, where the focus is on the sample complexity of labeled examples, whereas unlabeled examples come at a significantly lower cost.
Private Center Points and Learning of Halfspaces
- Computer Science, Mathematics, COLT
- 2019
The construction establishes a relationship between these two problems that is reminiscent of the relation between the median and learning one-dimensional thresholds, which suggests that the problem of privately locating a center point may have further applications in the design of differentially private algorithms.
Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity
- Computer Science, Mathematics, NeurIPS
- 2020
We present a differentially private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $\approx d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art…