Local differential privacy for physical sensor data and sparse recovery

@inproceedings{Gilbert2017LocalDP,
  title={Local differential privacy for physical sensor data and sparse recovery},
  author={Anna C. Gilbert and Audra McMillan},
  booktitle={2018 52nd Annual Conference on Information Sciences and Systems (CISS)},
  year={2018},
  pages={1-6}
}
  • Anna C. Gilbert, Audra McMillan
  • Published 31 May 2017
  • Mathematics
  • 2018 52nd Annual Conference on Information Sciences and Systems (CISS)
In this work, we exploit the ill-posedness of linear inverse problems to design algorithms to release differentially private data or measurements of the physical system. We discuss the spectral requirements on a matrix such that only a small amount of noise is needed to achieve privacy and contrast this with the ill-conditionedness. We then instantiate our framework with several diffusion operators and explore recovery via constrained minimisation. Our work indicates that it is possible to… 
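The release scheme the abstract describes can be illustrated roughly as follows. This is a hedged sketch, not the paper's algorithm: a 1-D explicit-Euler heat-diffusion matrix stands in for the paper's diffusion operators, the ℓ1 sensitivity accounting is a simple assumption, the constrained-minimisation recovery step is omitted, and all function names are hypothetical.

```python
import numpy as np

def diffusion_matrix(n, t=0.1, steps=50):
    """Hypothetical stand-in for the paper's diffusion operators:
    explicit-Euler heat diffusion on a 1-D grid (repeated local averaging).
    Its smoothly decaying spectrum is what makes it ill-conditioned."""
    A = np.eye(n)
    L = -2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # discrete Laplacian
    for _ in range(steps):
        A = A + t * (L @ A)
    return A

def private_measurements(A, x, eps, delta_x=1.0, rng=None):
    """Release y = A x plus Laplace noise calibrated to the l1 sensitivity
    of x -> A x when a single entry of x may change by at most delta_x
    (assumed neighbouring relation; the paper's accounting may differ)."""
    rng = np.random.default_rng() if rng is None else rng
    sensitivity = delta_x * np.abs(A).sum(axis=0).max()  # max column l1 norm
    y = A @ x
    return y + rng.laplace(scale=sensitivity / eps, size=y.shape)
```

Because the diffusion matrix heavily averages the input, its columns have small ℓ1 norm, so only a small amount of noise is needed; the private `y` would then be fed to a sparse-recovery solver, which is not shown here.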


On Sparse Linear Regression in the Local Differential Privacy Model

This paper shows that a polynomial dependence on the dimensionality of the space in the estimation error is unavoidable in both the non-interactive and the sequentially interactive local models, and that differential privacy for this problem is unlikely to be achievable in high-dimensional settings.

High Dimensional Sparse Linear Regression under Local Differential Privacy: Power and Limitations

  • Di Wang
  • Computer Science, Mathematics
  • 2018
It is shown that a polynomial dependence on the dimensionality p of the space in the estimation error is unavoidable under the non-interactive local model if the privacy of the whole dataset must be preserved, and that differential privacy for this problem is unlikely to be achievable in high-dimensional settings.

Reviewing and Improving the Gaussian Mechanism for Differential Privacy

The utilities of the proposed mechanisms improve on those of [1,2] and are close to that of the optimal, yet more computationally expensive, Gaussian mechanism.
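For context, the baseline being improved here is the classic Gaussian mechanism, which calibrates the noise scale to the ℓ2 sensitivity. A minimal sketch of that classic calibration (the Dwork–Roth bound, valid only for ε < 1) follows; it is not the improved mechanism the cited paper proposes, and the function name is illustrative.

```python
import math
import numpy as np

def gaussian_mechanism(value, l2_sensitivity, eps, delta, rng=None):
    """Classic (eps, delta)-DP Gaussian mechanism:
    sigma = sqrt(2 ln(1.25/delta)) * Delta_2 / eps, valid for 0 < eps < 1."""
    if not (0 < eps < 1):
        raise ValueError("classic calibration requires 0 < eps < 1")
    rng = np.random.default_rng() if rng is None else rng
    sigma = math.sqrt(2 * math.log(1.25 / delta)) * l2_sensitivity / eps
    return value + rng.normal(scale=sigma, size=np.shape(value))
```

The improved analyses reviewed in the paper tighten this σ, especially in the high-privacy (small ε) and large-ε regimes where the classic bound is loose or inapplicable.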

DP-LSSGD: A Stochastic Optimization Method to Lift the Utility in Privacy-Preserving ERM

A DP Laplacian smoothing SGD (DP-LSSGD) is proposed to train ML models with differential privacy (DP) guarantees; it makes training of both convex and nonconvex ML models more stable and enables the trained models to generalize better.

Differential Privacy, Property Testing, and Perturbations


References

Showing 1–10 of 50 references

Genomic Data Privacy Protection using Compressed Sensing

This article presents a privacy-preserving genomic data dissemination algorithm based on compressed sensing that adds noise to the sparse representation of the input vector to make it differentially private.

Compressive mechanism: utilizing sparse representation in differential privacy

The amount of noise is significantly reduced when the noise-insertion procedure is carried out on the synopsis samples instead of the original database; the proposed compressive mechanism is applied to the problem of continual release of statistical results.
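The core idea of noising a sparse synopsis rather than the raw data can be sketched as follows. This is an illustrative simplification, not the paper's exact mechanism: the choice of transform, the top-k truncation, and the simple k·Δ/ε Laplace accounting (which ignores the privacy cost of support selection) are all assumptions, and the function name is hypothetical.

```python
import numpy as np

def compressive_laplace(coeffs, k, eps, sensitivity=1.0, rng=None):
    """Sketch of a compressive mechanism: keep only the k largest-magnitude
    transform coefficients (the synopsis) and add Laplace noise to those,
    instead of noising all n raw entries. Noise cost then scales with k,
    not with n."""
    rng = np.random.default_rng() if rng is None else rng
    idx = np.argsort(np.abs(coeffs))[-k:]  # support of the synopsis
    synopsis = np.zeros_like(coeffs)
    synopsis[idx] = coeffs[idx] + rng.laplace(scale=k * sensitivity / eps, size=k)
    return synopsis
```

In practice `coeffs` would come from a transform in which the database is compressible (e.g. a DCT or wavelet basis), and the synopsis, not the database, is what gets released.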

Differential Location Privacy for Sparse Mobile Crowdsensing

ε-differential-privacy is adopted in Sparse MCS to provide a theoretical guarantee for participants' location privacy regardless of an adversary's prior knowledge, and a privacy-preserving framework with three components is proposed to reduce the data-quality loss caused by differential location obfuscation.

Differential privacy and distributed online learning for wireless big data

This paper gives the sensor nodes the ability to learn online, which reduces the size of data storage by “using” the data, and applies differential privacy to solve the privacy-preservation problem.

Computational Differential Privacy

This work extends the dense model theorem of Reingold et al. to demonstrate equivalence between two definitions (indistinguishability- and simulatability-based) of computational differential privacy, and presents a differentially-private protocol for computing the distance between two vectors.

Extremal Mechanisms for Local Differential Privacy

It is shown that, for all the information-theoretic utility functions studied in this paper, maximizing utility is equivalent to solving a linear program whose solution is an optimal staircase mechanism, and that the staircase mechanism is universally optimal in the high- and low-privacy regimes.
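The binary special case of such a staircase mechanism is classical randomized response, which is a convenient way to see what an ε-locally-differentially-private primitive looks like. A minimal sketch, with an illustrative function name:

```python
import math
import numpy as np

def randomized_response(bit, eps, rng=None):
    """Binary randomized response: report the true bit with probability
    e^eps / (e^eps + 1), flip it otherwise. The likelihood ratio between
    the two outputs is exactly e^eps, giving eps-local differential privacy."""
    rng = np.random.default_rng() if rng is None else rng
    p_true = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if rng.random() < p_true else 1 - bit
```

With ε = ln 3 the true bit is reported with probability 3/4; an analyst can debias aggregate frequencies from many such reports without ever seeing an individual's true value.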

Calibrating Noise to Sensitivity in Private Data Analysis

The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount by which any single argument to f can change its output.
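The resulting Laplace mechanism is simple to state in code. A minimal sketch of that calibration (the function name is illustrative):

```python
import numpy as np

def laplace_mechanism(f_value, l1_sensitivity, eps, rng=None):
    """Laplace mechanism: add Lap(Delta_f / eps) noise to f's output,
    where Delta_f is the l1 sensitivity of f, giving eps-differential
    privacy."""
    rng = np.random.default_rng() if rng is None else rng
    return f_value + rng.laplace(scale=l1_sensitivity / eps,
                                 size=np.shape(f_value))
```

For example, a counting query has sensitivity 1, so `laplace_mechanism(count, 1.0, eps)` releases an ε-differentially-private count.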

Signal Processing and Machine Learning with Differential Privacy: Algorithms and Challenges for Continuous Data

Progress on differentially private machine learning and signal processing algorithms for privacy-preserving data analysis is described.

Distributional differential privacy for large-scale smart metering

Novel differentially private mechanisms for sum queries are proposed that address the problem that parameters of the query's output distribution might still reveal sensitive personal information.