# Private Approximations of the 2nd-Moment Matrix Using Existing Techniques in Linear Regression

@article{Sheffet2015PrivateAO,
  title={Private Approximations of the 2nd-Moment Matrix Using Existing Techniques in Linear Regression},
  author={Or Sheffet},
  journal={ArXiv},
  year={2015},
  volume={abs/1507.00056}
}

We introduce three differentially private algorithms that approximate the 2nd-moment matrix of the data. These algorithms, which in contrast to existing algorithms output positive-definite matrices, correspond to existing techniques in the linear-regression literature. Specifically, we discuss the following three techniques. (i) For Ridge Regression, we propose setting the regularization coefficient so that approximating the solution using the Johnson-Lindenstrauss transform preserves privacy. (ii…
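
The ridge-plus-JL idea in (i) can be illustrated with a minimal numpy sketch. This is a hypothetical illustration, not the paper's exact construction: the function name, the Gaussian choice of projection matrix, and the regularization parameter `w` (the paper derives the value of `w` needed for privacy; here it is just an input) are all assumptions.

```python
import numpy as np

def jl_second_moment(X, r, w, rng=None):
    """Illustrative sketch: approximate the 2nd-moment matrix X^T X
    with a Johnson-Lindenstrauss projection, after appending
    sqrt(w) * I_d as ridge-style regularization so that every
    direction of the data has singular value at least sqrt(w)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Append sqrt(w) * I_d: the ridge regularizer, folded into the data.
    X_aug = np.vstack([X, np.sqrt(w) * np.eye(d)])
    # Gaussian JL projection down to r rows, scaled so E[Y^T Y] = X_aug^T X_aug.
    R = rng.standard_normal((r, n + d)) / np.sqrt(r)
    Y = R @ X_aug
    # The released approximation; by construction it is positive semi-definite.
    return Y.T @ Y
```

Note the contrast with additive-noise approaches: because the output is `Y.T @ Y`, it is positive semi-definite by construction, matching the abstract's claim that these algorithms output positive-definite matrices (once `w > 0`).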

## 24 Citations

### DIFFERENTIALLY PRIVATE SPARSE INVERSE COVARIANCE ESTIMATION

- Computer Science, Mathematics
- 2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
- 2018

The first results on the sparse inverse covariance estimation problem under the differential privacy model are presented, and a general covariance perturbation method is introduced to achieve both ε-differential privacy and (ε, δ)-differential privacy.

### A note on privacy preserving iteratively reweighted least squares

- Mathematics
- ArXiv
- 2016

A practical algorithm that overcomes the challenges of privacy-preserving IRLS and applies the concentrated differential privacy formalism, a more relaxed version of differential privacy that requires adding significantly less noise for the same level of privacy guarantee compared to the conventional and advanced compositions of differentially private mechanisms.

### Symmetric matrix perturbation for differentially-private principal component analysis

- Computer Science
- 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2016

This paper proposes a new algorithm for differentially-private computation of PCA and compares the performance empirically with some recent state-of-the-art algorithms on different data sets.

### Analysis of a privacy-preserving PCA algorithm using random matrix theory

- Computer Science
- 2016 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
- 2016

The privacy-preserving principal component algorithm proposed in [1] is a promising approach when a low-rank data summarization is desired; the analysis, which makes use of bounds on the vector-valued Bingham distribution on the unit sphere, is limited to the case of a single principal component.

### Efficient Private Empirical Risk Minimization for High-dimensional Learning

- Computer Science
- ICML
- 2016

This paper theoretically studies the problem of differentially private empirical risk minimization in the projected subspace (compressed domain) of ERM problems, and shows that for the class of generalized linear functions, excess risk bounds can be obtained given only the projected data and the projection matrix.

### Utility Preserving Secure Private Data Release

- Computer Science
- ArXiv
- 2019

A private data release mechanism that makes reconstruction of the original data impossible and also preserves utility for a wide range of machine learning algorithms is designed by combining the JL transform with noise generated from a Laplace distribution.
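
The combination this summary describes can be sketched in a few lines of numpy. This is a hedged illustration of the general JL-plus-Laplace recipe, not that paper's calibrated mechanism: the function name, the Gaussian projection, and the noise scale are assumptions, and a real deployment would set `scale` from the privacy budget.

```python
import numpy as np

def jl_laplace_release(X, r, scale, rng=None):
    """Illustrative sketch: project the n x d data X down to r rows
    with a random Gaussian JL matrix (destroying per-record structure),
    then add i.i.d. Laplace noise to the projected entries."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    R = rng.standard_normal((r, n)) / np.sqrt(r)
    Y = R @ X                                  # r x d sketch of the data
    return Y + rng.laplace(scale=scale, size=Y.shape)
```

The design intuition from the summary: the projection makes exact reconstruction of the original rows impossible, while the sketch still preserves enough geometry (inner products, least-squares structure) for many downstream learners.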

### Differentially Private Ridge Regression

- Computer Science
- 2021

Through the experimental results, it is found that the performance is often insensitive to the particular privacy-loss budgeting and that, for certain datasets, no choice of privacy-loss budget allows the adaptive adaSSPbudget to outperform the standard SSP algorithm.

### Distributed Sketching for Randomized Optimization: Exact Characterization, Concentration and Lower Bounds

- Computer Science
- ArXiv
- 2022

This work develops unbiased parameter-averaging methods for randomized second-order optimization of regularized problems that employ sketching of the Hessian, and provides closed-form formulas for regularization parameters and step sizes that provably minimize the bias of sketched Newton directions.

### Differentially Private Contextual Linear Bandits

- Computer Science
- NeurIPS
- 2018

This paper gives a general scheme converting the classic linear-UCB algorithm into a joint differentially private algorithm using the tree-based algorithm, and gives the first lower bound on the additional regret that any private algorithm for the MAB problem must incur.

## References

Showing 1-10 of 28 references

### Robust subspace iteration and privacy-preserving spectral analysis

- Computer Science, Mathematics
- 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2013

A new robust convergence analysis of the well-known subspace iteration algorithm for computing the dominant singular vectors of a matrix, also known as simultaneous iteration or the power method, is discussed, showing that the algorithm's error dependence on the matrix dimension can be replaced by a tight dependence on the coherence of the matrix.
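
For readers unfamiliar with the base algorithm being analyzed, plain subspace (simultaneous) iteration for a symmetric matrix is short enough to sketch. This shows only the classical, non-private iteration; the robust/private variant the paper analyzes adds noise at each step, which is omitted here.

```python
import numpy as np

def subspace_iteration(A, k, iters, rng=None):
    """Classical simultaneous iteration: repeatedly multiply a random
    orthonormal k-column block by the symmetric matrix A and
    re-orthonormalize, converging to the dominant invariant subspace."""
    rng = np.random.default_rng(rng)
    Q, _ = np.linalg.qr(rng.standard_normal((A.shape[0], k)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(A @ Q)   # power step, then re-orthonormalize
    return Q
```

With `k = 1` this reduces to the ordinary power method; the QR step is what keeps the k columns from all collapsing onto the top eigenvector.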

### Analyze gauss: optimal bounds for privacy-preserving principal component analysis

- Computer Science
- STOC
- 2014

It is shown that the well-known, but misnamed, randomized response algorithm provides nearly optimal additive quality gap compared to the best possible singular subspace of A, and that when ATA has a large eigenvalue gap -- a reason often cited for PCA -- the quality improves significantly.
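
The additive-noise mechanism this summary refers to is simple enough to sketch: perturb the 2nd-moment matrix A^T A with a symmetric matrix of i.i.d. Gaussian noise. This is an illustrative sketch only; `sigma` is assumed to be calibrated to the (ε, δ) budget and data norm bounds, and note that (in contrast to the positive-definite outputs of the main paper) the noisy output need not be PSD.

```python
import numpy as np

def gaussian_perturbed_moments(A, sigma, rng=None):
    """Illustrative sketch of Gaussian perturbation of A^T A: draw a
    d x d Gaussian matrix, symmetrize it by mirroring the upper
    triangle, and add it to the 2nd-moment matrix."""
    rng = np.random.default_rng(rng)
    d = A.shape[1]
    E = rng.normal(scale=sigma, size=(d, d))
    E = np.triu(E) + np.triu(E, 1).T   # symmetric: copy upper triangle down
    return A.T @ A + E
```

Symmetrizing the noise matters because A^T A is symmetric: only d(d+1)/2 entries change between neighboring datasets, and the released matrix should stay in the symmetric cone.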

### Differentially Private Linear Algebra in the Streaming Model

- Computer Science, Mathematics
- IACR Cryptol. ePrint Arch.
- 2014

This paper gives the first sketch-based algorithms for differential privacy: space-optimal (up to logarithmic factors) data structures that can compute low-rank approximation, linear regression, and matrix multiplication while preserving differential privacy, with better additive error bounds than previously known results.

### On differentially private low rank approximation

- Computer Science
- SODA
- 2013

This paper gives a polynomial-time algorithm that, given a privacy parameter ε > 0 and a symmetric matrix A, outputs an ε-differentially private approximation to the principal eigenvector of A, and shows how this algorithm can be used to obtain a differentially private rank-k approximation.

### Private Convex Optimization for Empirical Risk Minimization with Applications to High-dimensional Regression

- Computer Science, Mathematics
- COLT
- 2012

This work significantly extends the analysis of the “objective perturbation” algorithm of Chaudhuri et al. (2011) for convex ERM problems, and gives the best known algorithms for differentially private linear regression.
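
Objective perturbation has a particularly clean closed form in the ridge-regression case, which is worth writing out. This sketch is an illustration of the general technique for that special case, not the cited paper's exact algorithm: in the real mechanism the noise vector `b` is drawn with a scale calibrated to the privacy budget, and a privacy-dependent regularization term is required.

```python
import numpy as np

def objective_perturbed_ridge(X, y, lam, b):
    """Minimize ||Xw - y||^2 + lam * ||w||^2 + b . w in closed form.
    Setting the gradient to zero gives
        (X^T X + lam * I) w = X^T y - b / 2,
    so the perturbation shifts the ridge solution by a noise-dependent
    amount rather than adding noise to the output directly."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y - b / 2)
```

The design point: because the noise enters the *objective*, the released minimizer is an exact optimum of a perturbed problem, which typically yields better utility than perturbing the final solution vector.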

### Beating randomized response on incoherent matrices

- Computer Science
- STOC '12
- 2012

This work gives (the first) significant improvements in accuracy over randomized response under the natural and necessary assumption that the matrix has low coherence.

### Privacy for Free: Posterior Sampling and Stochastic Gradient Monte Carlo

- Computer Science
- ICML
- 2015

It is shown that, under standard assumptions, getting one sample from a posterior distribution is differentially private "for free"; that this sample, as a statistical estimator, is often consistent, near optimal, and computationally tractable; and that these observations lead to an "anytime" algorithm for Bayesian learning under a privacy constraint.

### The Differential Privacy of Bayesian Inference

- Computer Science
- 2015

It is found that while differential privacy is ostensibly achievable for most of the method variants, the conditions needed for it to do so are often not realistic for practical usage.

### Improved Approximation Algorithms for Large Matrices via Random Projections

- Computer Science
- 2006 47th Annual IEEE Symposium on Foundations of Computer Science (FOCS'06)
- 2006

The key idea is that low-dimensional embeddings can be used to eliminate data dependence and provide more versatile, linear-time, pass-efficient matrix computation.
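
The random-projection recipe for low-rank approximation that this line of work popularized can be sketched in a few lines. This is a generic illustration of the technique (project, orthonormalize, project back), with the oversampling parameter `p` as an assumption, not the cited paper's specific algorithm.

```python
import numpy as np

def rp_low_rank(A, k, p=5, rng=None):
    """Randomized rank-(k+p) approximation of A: sketch the column
    space via A @ R for a random Gaussian R, orthonormalize the sketch,
    and project A onto that subspace."""
    rng = np.random.default_rng(rng)
    R = rng.standard_normal((A.shape[1], k + p))
    Q, _ = np.linalg.qr(A @ R)     # orthonormal basis for the sketched range
    return Q @ (Q.T @ A)           # rank <= k+p approximation of A
```

The "pass efficiency" in the summary comes from the structure of this recipe: `A` is touched only in the two matrix products, so the whole computation needs a constant number of passes over the data.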

### The Johnson-Lindenstrauss Transform Itself Preserves Differential Privacy

- Computer Science, Mathematics
- 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science
- 2012

This paper proves that an "old dog", namely the classical Johnson-Lindenstrauss transform, "performs new tricks": it gives a novel way of preserving differential privacy. We show that if we take…