Corpus ID: 14586249

Privacy for the Protected (Only)

@article{Kearns2015PrivacyFT,
  title={Privacy for the Protected (Only)},
  author={Michael Kearns and Aaron Roth and Zhiwei Steven Wu and Grigory Yaroslavtsev},
  journal={ArXiv},
  year={2015},
  volume={abs/1506.00242}
}
Motivated by tensions between data privacy for individual citizens and societal priorities such as counterterrorism and the containment of infectious disease, we introduce a computational model that distinguishes between parties for whom privacy is explicitly protected, and those for whom it is not (the targeted subpopulation). The goal is the development of algorithms that can effectively identify and take action upon members of the targeted subpopulation in a way that minimally compromises…
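As a minimal sketch of the distinction the model draws (hypothetical names and a toy counting statistic, not the paper's actual algorithms, which tackle richer search problems):

```python
import numpy as np

rng = np.random.default_rng(0)

def report(records, epsilon, suspicious):
    """Toy mechanism in the spirit of the model: targeted records may be
    acted on exactly, while any statistic that touches protected records
    is released with differential privacy. `records` is a list of dicts
    with keys 'id' and 'protected'; `suspicious` is any predicate."""
    targeted = [r for r in records if not r["protected"]]
    protected = [r for r in records if r["protected"]]

    # No formal guarantee for the targeted subpopulation: exact output.
    flagged = [r["id"] for r in targeted if suspicious(r)]

    # epsilon-DP for the protected subpopulation: a count changes by at
    # most 1 when one protected record changes, so Laplace(1/epsilon)
    # noise suffices.
    noisy_count = len(protected) + rng.laplace(scale=1.0 / epsilon)
    return flagged, noisy_count
```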

Citations

Multi-owner multi-user privacy
TLDR
This paper presents an approach for a multi-owner multi-user (MOMU) system where data owners require privacy guarantees before offering their private data; it considers a Gaussian mechanism, derives the constraints on the covariance matrix, and proposes a convex semidefinite relaxation for designing the covariance.
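For concreteness, the release step with a designed covariance might look like the following sketch; the covariance itself would come from the semidefinite program described in the summary, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mechanism(x, cov):
    """Release x + N(0, cov). In the MOMU setting the covariance matrix
    is the optimization variable: it must be 'large enough' to satisfy
    each owner's privacy constraint while distorting the users' queries
    as little as possible; here `cov` is assumed precomputed."""
    x = np.asarray(x, dtype=float)
    return x + rng.multivariate_normal(mean=np.zeros(len(x)), cov=cov)

# Illustrative two-dimensional query answer with an assumed covariance.
answer = gaussian_mechanism([10.0, 4.0], np.array([[2.0, 0.5],
                                                   [0.5, 1.0]]))
```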
Chasing Your Long Tails: Differentially Private Prediction in Health Care Settings
TLDR
This paper uses state-of-the-art methods for DP learning to train privacy-preserving models on clinical prediction tasks, including x-ray image classification and mortality prediction on time-series data, and uses these models to perform a comprehensive empirical investigation of the trade-offs between privacy, utility, robustness to dataset shift, and fairness.
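The standard DP training recipe in this setting is per-example gradient clipping plus Gaussian noise (DP-SGD); a self-contained sketch on logistic regression (the paper's experiments use deeper models):

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(w, X_batch, y_batch, lr=0.1, clip=1.0, noise_mult=1.1):
    """One DP-SGD step for logistic regression: clip each per-example
    gradient to L2 norm `clip`, average, then add Gaussian noise of
    scale noise_mult * clip / batch_size (the standard calibration)."""
    grads = []
    for x, y in zip(X_batch, y_batch):
        p = 1.0 / (1.0 + np.exp(-x @ w))          # predicted probability
        g = (p - y) * x                           # per-example gradient
        norm = np.linalg.norm(g)
        grads.append(g * min(1.0, clip / max(norm, 1e-12)))  # clip
    g = np.mean(grads, axis=0)
    g = g + rng.normal(scale=noise_mult * clip / len(X_batch), size=w.shape)
    return w - lr * g
```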
Differentially Private Link Prediction with Protected Connections
TLDR
A form of differential privacy on graphs is proposed that models the privacy loss of only those node pairs marked as protected, together with DPLP, a learning-to-rank algorithm that applies a monotone transform to base scores from a non-private link-prediction (LP) system and then adds noise.
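A sketch of that recipe (illustrative names; the monotone transform preserves the ranking signal while bounding each score's sensitivity, so the added noise can be calibrated):

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_link_scores(base_scores, epsilon, transform=np.tanh):
    """Apply a monotone, bounded transform to each non-private link-
    prediction score, then add Laplace noise. In DPLP the privacy
    accounting (omitted here) charges loss only for node pairs marked
    as protected; this sketch shows just the transform-then-noise step."""
    return {pair: float(transform(s)) + rng.laplace(scale=1.0 / epsilon)
            for pair, s in base_scores.items()}

scores = dp_link_scores({("u", "v"): 2.7, ("u", "w"): -0.4}, epsilon=1.0)
```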
Privacy-Preserving Outlier Detection for Data Streams
TLDR
This paper contributes an algorithm that combines local, differentially private data perturbation of sensor streams with highly accurate outlier detection, and evaluates the algorithm on synthetic data.
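A generic version of that pipeline, sketched under the assumption of readings bounded in [lo, hi] (not the cited paper's exact mechanism): local-DP perturbation at the sensor, then ordinary outlier detection at the aggregator.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_stream(stream, epsilon, lo=0.0, hi=100.0):
    """Local DP for a bounded sensor stream: clamp each reading to
    [lo, hi] and add Laplace noise scaled to the range, so the raw
    value never leaves the sensor."""
    scale = (hi - lo) / epsilon
    return [float(np.clip(v, lo, hi)) + rng.laplace(scale=scale)
            for v in stream]

def flag_outliers(values, z=3.0):
    """Plain z-score outlier detection, run on the noisy stream."""
    a = np.asarray(values)
    mu, sd = a.mean(), a.std()
    return [i for i, v in enumerate(a) if abs(v - mu) > z * sd]
```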
Advances and Open Problems in Federated Learning
TLDR
Motivated by the explosive growth in FL research, this paper discusses recent advances and presents an extensive collection of open problems and challenges.

References

Showing 1-10 of 15 references
The Algorithmic Foundations of Differential Privacy
TLDR
The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
Differentially private data analysis of social networks via restricted sensitivity
TLDR
The notion of restricted sensitivity is introduced as an alternative to global and smooth sensitivity for improving accuracy in differentially private data analysis, and its usefulness is demonstrated on the task of answering queries about social networks, which combine a graph with a labeling of its vertices.
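The accuracy gain is easiest to see on a single statistic. Under node-level privacy the edge count has worst-case sensitivity about n, but restricted to graphs of degree at most D it is at most D, so far less noise is needed. A toy sketch; the paper's construction also handles inputs outside the promised class:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_edge_count(adj, degree_bound, epsilon):
    """Node-private edge count via restricted sensitivity: over all
    graphs, rewiring one node can shift the edge count by up to n - 1,
    but restricted to graphs of degree <= degree_bound the shift is at
    most degree_bound, so Laplace noise of that scale suffices.
    `adj` maps each node to the set of its neighbors."""
    m = sum(len(nbrs) for nbrs in adj.values()) // 2
    return m + rng.laplace(scale=degree_bound / epsilon)
```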
Coupled-Worlds Privacy: Exploiting Adversarial Uncertainty in Statistical Data Privacy
TLDR
A new framework for defining privacy in statistical databases is presented that enables reasoning about, and exploiting, adversarial uncertainty about the data; several natural, "noiseless" mechanisms are shown to satisfy the definitional framework under realistic assumptions on the distribution of the underlying data.
Analyzing Graphs with Node Differential Privacy
TLDR
A generic, efficient reduction is derived that allows us to apply any differentially private algorithm for bounded-degree graphs to an arbitrary graph, based on analyzing the smooth sensitivity of the 'naive' truncation that simply discards nodes of high degree.
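The truncation step itself is simple; the paper's contribution is bounding its smooth sensitivity so that the result can be released privately (that analysis is not reproduced here):

```python
def truncate(adj, max_degree):
    """'Naive' truncation: drop every node whose degree exceeds
    max_degree, along with its incident edges. The surviving graph has
    degree at most max_degree, so any bounded-degree node-DP algorithm
    can then be applied to it. `adj` maps nodes to neighbor sets."""
    keep = {v for v, nbrs in adj.items() if len(nbrs) <= max_degree}
    return {v: nbrs & keep for v, nbrs in adj.items() if v in keep}
```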
Calibrating Noise to Sensitivity in Private Data Analysis
TLDR
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
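Concretely, this is the Laplace mechanism: release f(x) plus noise of scale Δf/ε, where Δf is the sensitivity described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(f_value, sensitivity, epsilon):
    """Release f(x) + Lap(sensitivity / epsilon); this calibration gives
    epsilon-differential privacy for any function with the stated
    sensitivity."""
    return f_value + rng.laplace(scale=sensitivity / epsilon)

# A counting query changes by at most 1 per person, so sensitivity = 1.
noisy_count = laplace_mechanism(f_value=42, sensitivity=1.0, epsilon=0.1)
```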
Mechanism design in large games: incentives and privacy
TLDR
The main technical result is an algorithm for computing a correlated equilibrium of a large game while satisfying joint differential privacy, which ends up satisfying a strong privacy property as well.
Accurate Estimation of the Degree Distribution of Private Networks
TLDR
An efficient algorithm for releasing a provably private estimate of the degree distribution of a network, showing that the algorithm's variance and bias are low, that the error diminishes as the size of the input graph increases, and that common analyses such as fitting a power law can be carried out very accurately.
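A sketch in the spirit of that algorithm: noise the sorted degree sequence (one edge change moves at most two degrees by one, so the L1 sensitivity is 2), then post-process by projecting back onto sorted sequences, which is where most of the error cancels. The paper's exact estimator and analysis are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def pava(y):
    """Project y onto nondecreasing sequences (pool-adjacent-violators)."""
    out = []
    for v in y:
        out.append([v, 1])                 # [block mean, block size]
        while len(out) > 1 and out[-2][0] > out[-1][0]:
            m2, n2 = out.pop()
            m1, n1 = out.pop()
            out.append([(m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2])
    return np.concatenate([[m] * n for m, n in out])

def private_degree_sequence(degrees, epsilon):
    """Noisy sorted degree sequence plus monotone projection."""
    noisy = np.sort(degrees) + rng.laplace(scale=2.0 / epsilon,
                                           size=len(degrees))
    return pava(noisy)
```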
Boosting and Differential Privacy
TLDR
This work obtains an $O(\varepsilon^2)$ bound on the expected privacy loss from a single $\varepsilon$-differentially private mechanism, and stronger bounds on the expected cumulative privacy loss due to multiple mechanisms, each of which provides $\varepsilon$-differential privacy or one of its relaxations, and each of which operates on (potentially) different, adaptively chosen databases.
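The composition bound this yields can be computed directly; the following is the standard statement of the advanced composition theorem from this line of work, where the k·ε(e^ε − 1) term is the accumulated O(ε²) expected loss:

```python
import math

def advanced_composition(eps, delta_prime, k):
    """Total privacy of k adaptively chosen eps-DP mechanisms:
    (sqrt(2k ln(1/delta')) * eps + k * eps * (e^eps - 1), delta')-DP."""
    return (math.sqrt(2 * k * math.log(1 / delta_prime)) * eps
            + k * eps * (math.exp(eps) - 1))

# 100 mechanisms at eps = 0.1 compose to eps_total ~ 6.3,
# versus 10.0 under basic (linear) composition.
print(advanced_composition(eps=0.1, delta_prime=1e-6, k=100))
```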
Selective privacy guarantees
Bulk Collection of Signals Intelligence: Technical Options. The National Academies Press, 2015