Corpus ID: 18869250

Differential privacy for counting queries: can Bayes estimation help uncover the true value?

@article{Naldi2014DifferentialPF,
  title={Differential privacy for counting queries: can Bayes estimation help uncover the true value?},
  author={Maurizio Naldi and Giuseppe D'Acquisto},
  journal={ArXiv},
  year={2014},
  volume={abs/1407.0116}
}
Differential privacy is achieved by the introduction of Laplacian noise in the response to a query, establishing a precise trade-off between the level of differential privacy and the accuracy of the database response (via the amount of noise introduced). Multiple queries may improve the accuracy but erode the privacy budget. We examine the case where we submit just a single counting query. We show that even in that case a Bayesian approach may be used to improve the accuracy for the same amount…
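The Laplace mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper; the function name and parameters are our own. A counting query has sensitivity 1 (adding or removing one record changes the count by at most 1), so noise drawn from a Laplace distribution with scale sensitivity/ε yields ε-differential privacy:

```python
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0, rng=None):
    """Return a differentially private answer to a counting query.

    For a counting query the sensitivity is 1, so Laplace noise with
    scale sensitivity/epsilon provides epsilon-differential privacy.
    Smaller epsilon means stronger privacy and larger noise.
    """
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Example: a query whose true answer is 100, answered under epsilon = 0.5
noisy = laplace_count(100, epsilon=0.5)
```

The trade-off the abstract mentions is visible in the `scale` parameter: halving ε doubles the expected magnitude of the noise, trading accuracy for privacy.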
Citations

Differential Privacy: An Estimation Theory-Based Method for Choosing Epsilon
This paper proposes two parameters related to the notion of interval estimation, which provide a more intuitive picture of how precisely the true output of a counting query may be gauged from the noise-polluted one.
Mr X vs. Mr Y: The Emergence of Externalities in Differential Privacy
It is shown that an attack on Mr. X can be conducted by an oracle by computing the likelihood ratio under two scenarios, where the database population consists of either independent or correlated entries.
A Conceptual Framework for Assessing Anonymization-Utility Trade-Offs Based on Principal Component Analysis
  • G. D'Acquisto, M. Naldi
  • Computer Science
  • International Journal of Simulation: Systems, Science & Technology
  • 2019
An anonymization technique for databases based on Principal Component Analysis is proposed, along with alternative metrics to assess utility, based respectively on matrix norms, correlation coefficients, divergence measures, and quality indices of database images.
Statistical Privacy for Streaming Traffic
Techniques previously used in the domains of adversarial machine learning and differential privacy are adapted to mitigate the machine-learning-powered analysis of streaming traffic, and two mechanisms for enforcing differential privacy on encrypted streaming traffic are proposed.
Option pricing in a privacy-aware market
  • M. Naldi, G. D'Acquisto
  • Computer Science
  • 2015 IEEE Conference on Communications and Network Security (CNS)
  • 2015
This work provides a formula for the option price, which is found to be somewhat larger than the cost of the declared excess items.
Mitigating Storage Side Channels Using Statistical Privacy Mechanisms
This work brings advances in privacy for statistical databases to bear on storage side-channel defense, and demonstrates the feasibility of applying differentially private mechanisms to mitigate storage side channels in procfs, a pseudo file system broadly used in Linux and Android kernels.
Option contracts for a privacy-aware market
A market is envisaged where private information can be protected through the use of differential privacy and option contracts, while privacy-aware suppliers deliver their stock at a reduced price.
Privacy Technologies and Policy
This work analyzes the metadata of over one million apps from the Google Play Store to answer the question "Which apps have privacy policies?" by analyzing the relationship between app metadata features and whether apps link to privacy policies.
Privacy Technologies and Policy: 8th Annual Privacy Forum, APF 2020, Lisbon, Portugal, October 22–23, 2020, Proceedings
A specialized methodological framework for carrying out a Data Protection Impact Assessment (DPIA) is proposed to enable controllers to assess and prevent ex ante the risk to the right to non-discrimination, one of the key fundamental rights that the GDPR aims to safeguard.
Protecting suppliers' private information: the case of stock levels and the impact of correlated items
A marketplace is defined where the private data of suppliers (e.g., prosumers) are protected, so that neither their identity nor their level of stock is made known to end customers, while they can…

References

Showing 1–10 of 16 references
Optimizing linear counting queries under differential privacy
The matrix mechanism, a new algorithm for answering a workload of predicate counting queries, is proposed; the problem of computing the optimal query strategy in support of a given workload can be formulated as a rank-constrained semidefinite program.
Calibrating Noise to Sensitivity in Private Data Analysis
The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
Differential Privacy and Statistical Disclosure Risk Measures: An Investigation with Binary Synthetic Data
An alternative disclosure risk assessment approach is presented that integrates some of the strong confidentiality protection features of ϵ-differential privacy with the interpretability and data-specific nature of probabilistic disclosure risk measures.
Evaluating Laplace Noise Addition to Satisfy Differential Privacy for Numeric Data
The results indicate that Laplace noise addition delivers the promised level of privacy only by adding a large quantity of noise, even for relatively large subsets, raising serious questions regarding the viability of Laplace-based noise addition for masking numeric data.
Differential Privacy: A Survey of Results
This survey recalls the definition of differential privacy and two basic techniques for achieving it, and shows some interesting applications of these techniques, presenting algorithms for three specific tasks and three general results on differentially private learning.
Differential Privacy
A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.
Privacy-preserving data mining
This work considers the concrete case of building a decision-tree classifier from training data in which the values of individual records have been perturbed, and proposes a novel reconstruction procedure to accurately estimate the distribution of original data values.
Security-control methods for statistical databases: a comparative study
This paper recommends directing future research efforts toward developing new methods that prevent exact disclosure and provide statistical-disclosure control, while at the same time not suffering from the bias problem and the 0/1 query-set-size problem.
Third-party apps on Facebook: privacy and the illusion of control
This research proposes two new interface designs for third-party apps' authentication dialogs to increase user control of apps' data access and restrict apps' publishing ability during the process of adding them to users' profiles, and alerts users when their global privacy settings on Facebook are violated by apps.
Hey, You, Get Off of My Market: Detecting Malicious Apps in Official and Alternative Android Markets
A permission-based behavioral footprinting scheme to detect new samples of known Android malware families and a heuristics-based filtering scheme to identify certain inherent behaviors of unknown malicious families are proposed.