# KNG: The K-Norm Gradient Mechanism

    @article{Reimherr2019KNGTK,
      title   = {KNG: The K-Norm Gradient Mechanism},
      author  = {Matthew L. Reimherr and Jordan Awan},
      journal = {ArXiv},
      year    = {2019},
      volume  = {abs/1905.09436}
    }

This paper presents a new mechanism for producing sanitized statistical summaries that achieve *differential privacy*, called the *K-Norm Gradient* mechanism, or KNG. The new approach retains the strong flexibility of the exponential mechanism while achieving the powerful utility performance of objective perturbation. KNG starts with an inherent objective function (often an empirical risk) and promotes summaries that are close to minimizing the objective by weighting according to…
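
As a rough illustration of the idea (a sketch, not the paper's exact construction): KNG samples an output θ with density proportional to exp(−ε‖∇f_D(θ)‖_K / (2Δ)), where f_D is the objective, ‖·‖_K is a chosen norm, and Δ bounds how much the gradient can change between neighboring datasets. A minimal one-dimensional sketch for a bounded mean with the squared-error objective, using grid-based sampling (the grid discretization and the sensitivity bound are assumptions of this toy version):

```python
import numpy as np

def kng_mean(data, epsilon, lo=0.0, hi=1.0, grid_size=10_000, rng=None):
    """Toy KNG-style sampler for the mean of data clipped to [lo, hi].

    Objective: f_D(theta) = 0.5 * mean((x_i - theta)^2), so
    grad f_D(theta) = theta - mean(x). Swapping one record changes the
    gradient by at most (hi - lo) / n, used here as the sensitivity Delta.
    KNG samples theta with density proportional to
    exp(-epsilon * |grad f_D(theta)| / (2 * Delta)).
    """
    rng = np.random.default_rng(rng)
    data = np.clip(np.asarray(data, dtype=float), lo, hi)
    n = len(data)
    delta = (hi - lo) / n                     # assumed gradient sensitivity
    theta = np.linspace(lo, hi, grid_size)    # discretized output space
    grad_norm = np.abs(theta - data.mean())   # |grad f_D(theta)|
    log_w = -epsilon * grad_norm / (2 * delta)
    w = np.exp(log_w - log_w.max())           # stabilize before normalizing
    return rng.choice(theta, p=w / w.sum())

# Usage: a private estimate of the mean of 1000 bounded points;
# the draw concentrates near the true mean for moderate epsilon.
x = np.random.default_rng(0).uniform(0, 1, 1000)
private_est = kng_mean(x, epsilon=1.0, rng=0)
```

Note how the noise scale shrinks with n through Δ: larger datasets yield gradients that are less sensitive to any one record, so the sampled θ concentrates more tightly around the minimizer.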

## 15 Citations

### Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms

- Computer Science · Journal of the American Statistical Association
- 2020

This work addresses the problem of releasing a noisy real-valued statistic vector T, a function of sensitive data, under DP via the class of K-norm mechanisms, with the goal of minimizing the noise added to achieve privacy.

### Perturbed M-Estimation: A Further Investigation of Robust Statistics for Differential Privacy

- Computer Science · ArXiv
- 2021

The objective perturbation mechanism is modified using a new bounded function to define a bounded M-estimator with adequate statistical properties, which shows important potential for improved statistical utility of its outputs, as suggested by preliminary results.

### Instance-optimality in differential privacy via approximate inverse sensitivity mechanisms

- Computer Science, Mathematics · NeurIPS
- 2020

We study and provide instance-optimal algorithms in differential privacy by extending and approximating the inverse sensitivity mechanism. We provide two approximation frameworks, one which only…

### Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

- Computer Science, Mathematics · NeurIPS
- 2020

This work establishes rapid convergence for differentially private algorithms under distance measures more suitable for differential privacy, and gives the first results proving convergence in Rényi divergence for smooth, strongly convex f.

### One Step to Efficient Synthetic Data

- Computer Science · ArXiv
- 2020

The approach allows for the construction both of partially synthetic datasets, which preserve summary statistics without formal privacy methods, and of fully synthetic data that satisfies the strong guarantee of differential privacy (DP), in both cases with asymptotically efficient summary statistics.

### Exact Privacy Guarantees for Markov Chain Implementations of the Exponential Mechanism with Artificial Atoms

- Computer Science · NeurIPS
- 2021

This paper proposes an additional modification of this sampling algorithm that maintains its ε-DP guarantee and has improved runtime at the cost of some utility, demonstrating that there is also a trade-off between privacy guarantees and runtime.

### Formal Privacy for Partially Private Data

- Computer Science
- 2022

This paper presents a privacy formalism, ε-DP relative to Z, extending Pufferfish privacy, that accommodates DP-style semantics in the presence of public information, and introduces two mechanisms for releasing partially private data (PPD), demonstrating theoretically and empirically how statistical inference from PPD degrades with postprocessing.

### Statistical Data Privacy: A Song of Privacy and Utility

- Computer Science · Annual Review of Statistics and Its Application
- 2022

The statistical foundations common to both SDC and DP are discussed, major developments in SDP are highlighted, and exciting open research problems in private inference are presented.

### Data Augmentation MCMC for Bayesian Inference from Privatized Data

- Computer Science, Mathematics
- 2022

This work proposes an MCMC framework to perform Bayesian inference from the privatized data, which is applicable to a wide range of statistical models and privacy mechanisms, and illustrates the efficacy and applicability of the methods on a naïve-Bayes log-linear model as well as on a linear regression model.

### Privacy-Aware Rejection Sampling

- Computer Science, Mathematics · ArXiv
- 2021

This work proposes three modifications to the rejection sampling algorithm, with varying assumptions, to protect against timing attacks by making the runtime independent of the data, and proposes an approximate sampler, introducing a small increase in the privacy cost.

## References

Showing 1-10 of 39 references.

### Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms

- Computer Science · Journal of the American Statistical Association
- 2020

This work addresses the problem of releasing a noisy real-valued statistic vector T, a function of sensitive data, under DP via the class of K-norm mechanisms, with the goal of minimizing the noise added to achieve privacy.

### Benefits and Pitfalls of the Exponential Mechanism with Applications to Hilbert Spaces and Functional PCA

- Computer Science, Mathematics · ICML
- 2019

A positive result is provided that establishes a Central Limit Theorem for the exponential mechanism quite broadly, while showing that the magnitude of the noise introduced for privacy is asymptotically non-negligible relative to the statistical estimation error.

### Differential Privacy without Sensitivity

- Computer Science, Mathematics · NIPS
- 2016

This paper extends the classical exponential mechanism, allowing the loss function to have unbounded sensitivity.

### Private Convex Empirical Risk Minimization and High-dimensional Regression

- Computer Science, Mathematics · COLT
- 2012

This work significantly extends the analysis of the “objective perturbation” algorithm of Chaudhuri et al. (2011) for convex ERM problems, and gives the best known algorithms for differentially private linear regression.

### Functional Mechanism: Regression Analysis under Differential Privacy

- Computer Science · Proc. VLDB Endow.
- 2012

The main idea is to enforce ε-differential privacy by perturbing the objective function of the optimization problem rather than its results; the proposed mechanism significantly outperforms existing solutions.

### Privacy-preserving logistic regression

- Computer Science · NIPS
- 2008

This paper addresses the important tradeoff between privacy and learnability when designing algorithms for learning from private databases, providing a privacy-preserving regularized logistic regression algorithm based on a new privacy-preserving technique.

### A near-optimal algorithm for differentially-private principal components

- Computer Science · J. Mach. Learn. Res.
- 2013

This paper investigates the theory and empirical performance of differentially private approximations to PCA, proposing a new method that explicitly optimizes the utility of the output and showing that on real data there is a large performance gap between the existing method and the new one.

### Differentially Private Empirical Risk Minimization

- Computer Science · J. Mach. Learn. Res.
- 2011

This work proposes a new method, objective perturbation, for privacy-preserving machine learning algorithm design, and shows both theoretically and empirically that this method is superior to the previous state of the art, output perturbation, in managing the inherent tradeoff between privacy and learning performance.

### Calibrating Noise to Sensitivity in Private Data Analysis

- Computer Science · TCC
- 2006

The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
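
The calibration principle this reference describes is commonly instantiated as the Laplace mechanism: add noise with scale equal to the function's global sensitivity divided by ε. A minimal sketch (the mean-release example and its 1/n sensitivity bound are illustrative assumptions, not from the reference itself):

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Output perturbation: add Laplace noise with scale sensitivity/epsilon,
    where sensitivity bounds how much any single record can change f's output."""
    rng = np.random.default_rng(rng)
    return value + rng.laplace(scale=sensitivity / epsilon)

# Usage: privately release the mean of n values in [0, 1];
# changing one record moves the mean by at most 1/n.
x = np.random.default_rng(1).uniform(0, 1, 500)
private_mean = laplace_mechanism(x.mean(), sensitivity=1 / len(x), epsilon=0.5)
```

Because sensitivity shrinks as 1/n here, the added noise becomes negligible relative to sampling error as the dataset grows, which is what makes this calibration useful for statistical summaries.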

### Privacy-preserving statistical estimation with optimal convergence rates

- Mathematics, Computer Science · STOC '11
- 2011

It is shown that for a large class of statistical estimators T and input distributions P, there is a differentially private estimator A_T with the same asymptotic distribution as T, which implies that A_T(X) is essentially as good as the original statistic T(X) for statistical inference, for sufficiently large samples.