# Privacy Against Brute-Force Inference Attacks

```bibtex
@article{Ossia2019PrivacyAB,
  title   = {Privacy Against Brute-Force Inference Attacks},
  author  = {Seyed Ali Ossia and Borzoo Rassouli and Hamed Haddadi and Hamid R. Rabiee and Deniz G{\"u}nd{\"u}z},
  journal = {2019 IEEE International Symposium on Information Theory (ISIT)},
  year    = {2019},
  pages   = {637-641}
}
```

Privacy-preserving data release is about disclosing information about useful data while retaining the privacy of sensitive data. Assuming that the sensitive data is threatened by a brute-force adversary, we define Guessing Leakage as a measure of privacy, based on the concept of guessing. After investigating the properties of this measure, we derive the optimal utility-privacy trade-off via a linear program with any f-information adopted as the utility measure, and show that the optimal utility…
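The abstract's formal definition of Guessing Leakage is not reproduced here, but the brute-force-guessing model behind it can be made concrete. As a minimal sketch with a made-up joint distribution, the following compares an adversary's expected number of guesses for a secret S before and after observing a released value Y, assuming the adversary guesses outcomes in decreasing order of probability:

```python
def expected_guesses(probs):
    """Expected number of guesses when outcomes are tried in
    decreasing order of probability (optimal brute-force order)."""
    ordered = sorted(probs, reverse=True)
    return sum((i + 1) * p for i, p in enumerate(ordered))

# Hypothetical joint distribution p(s, y) of a secret S and a release Y.
p_sy = {
    ("s0", "y0"): 0.4, ("s0", "y1"): 0.1,
    ("s1", "y0"): 0.1, ("s1", "y1"): 0.4,
}

# Marginals p(s) and p(y).
p_s, p_y = {}, {}
for (s, y), p in p_sy.items():
    p_s[s] = p_s.get(s, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Expected guesswork without the release (adversary knows only p(s)) ...
g_prior = expected_guesses(p_s.values())                     # 1.5
# ... and with the release, averaging over y the guesswork under p(s | y).
g_post = sum(
    p_y[y] * expected_guesses([p_sy[(s, y)] / p_y[y] for s in p_s])
    for y in p_y
)                                                            # 1.2
```

The drop from 1.5 to 1.2 expected guesses quantifies how much the release helps a brute-force adversary; a guessing-based leakage measure can be built from such before/after comparisons.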

## 6 Citations

Privacy-Utility Tradeoff and Privacy Funnel

- 2020

We consider a privacy-utility trade-off encountered by users who wish to disclose to an analyst some information that is correlated with their private data, in the hope of receiving some utility. We…

Active Privacy-Utility Trade-Off Against A Hypothesis Testing Adversary

- Computer Science, Mathematics · ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2021

This work considers active sequential data release, where at each time step the user chooses from among a finite set of release mechanisms, each revealing some information about the user's personal data. The problem is formulated as a Markov decision process (MDP) and solved numerically with advantage actor-critic (A2C) deep reinforcement learning (RL).

On the Robustness of Information-Theoretic Privacy Measures and Mechanisms

- Computer Science, Mathematics · IEEE Transactions on Information Theory
- 2020

It is proved that the optimal privacy mechanisms for the empirical distribution approach the corresponding mechanism for the true distribution as the sample size increases, thereby establishing the statistical consistency of the optimal privacy mechanisms.

Review of results on smart‐meter privacy by data manipulation, demand shaping, and load scheduling

- Computer Science · IET Smart Grid
- 2020

Several methods for ensuring smart-meter privacy, relying on homomorphic encryption, differential privacy, information theory, and statistics, are presented.

A Hybrid Deep Learning Architecture for Privacy-Preserving Mobile Analytics

- Computer Science · IEEE Internet of Things Journal
- 2020

This article presents a hybrid approach for breaking down large, complex deep neural networks for cooperative and privacy-preserving analytics, and shows that, by using Siamese fine-tuning and at a small processing cost, this approach can greatly reduce the level of unnecessary, potentially sensitive information in the personal data.

Measuring Information Leakage in Non-stochastic Brute-Force Guessing

- Computer Science, Engineering · 2020 IEEE Information Theory Workshop (ITW)
- 2021

An operational measure of information leakage in a non-stochastic setting is developed to formalize privacy against a brute-force guessing adversary, and its relationship to maximin information and stochastic maximal leakage, both of which are shown to arise in one-shot guessing, is investigated.

## References

Showing 1-10 of 21 references

Privacy against statistical inference

- Computer Science, Mathematics · 2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2012

It is proved that under both metrics the resulting design problem of finding the optimal mapping from the user's data to a privacy-preserving output can be cast as a modified rate-distortion problem which, in turn, can be formulated as a convex program.

Optimal Utility-Privacy Trade-off with Total Variation Distance as a Privacy Measure

- Computer Science, Mathematics · 2018 IEEE Information Theory Workshop (ITW)
- 2018

The total variation distance is proposed as a privacy measure in an information disclosure scenario when the goal is to reveal some information about available data in order to receive utility, while…
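A minimal sketch of the total variation (TV) idea, with a made-up prior and posteriors rather than the paper's formulation: one natural TV-based leakage score is the worst-case shift that any single released value can cause in the adversary's belief about the sensitive variable.

```python
def tv(p, q):
    """Total variation distance between two distributions on the same support."""
    return 0.5 * sum(abs(p[k] - q[k]) for k in p)

p_s = {"s0": 0.5, "s1": 0.5}            # prior on the sensitive variable S
posteriors = {                           # hypothetical posteriors p(s | y)
    "y0": {"s0": 0.8, "s1": 0.2},
    "y1": {"s0": 0.2, "s1": 0.8},
}

# Worst-case belief shift caused by observing any single release y.
leakage = max(tv(posteriors[y], p_s) for y in posteriors)    # 0.3
```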

From the Information Bottleneck to the Privacy Funnel

- Computer Science, Mathematics · 2014 IEEE Information Theory Workshop (ITW 2014)
- 2014

It is shown that the privacy-utility tradeoff under the log-loss can be cast as the non-convex Privacy Funnel optimization, and its connection to the Information Bottleneck is leveraged to provide a greedy algorithm that is locally optimal.

Maximizing Privacy under Data Distortion Constraints in Noise Perturbation Methods

- Mathematics, Computer Science · PinKDD
- 2008

This work addresses an important shortcoming of noise perturbation methods, by providing them with an intuitive definition of privacy analogous to the definition used in k-anonymity, and an analytical means for selecting parameters to achieve a desired level of privacy.

Privacy-Utility Tradeoffs under Constrained Data Release Mechanisms

- Computer Science, Mathematics · ArXiv
- 2017

This work studies how the optimal privacy-utility tradeoff region is affected by constraints on the data that is directly available as input to the release mechanism, and derives exact closed-form analytical expressions for the privacy-utility tradeoffs for symmetrically dependent sensitive and useful data under mutual information and Hamming distortion.

A Tunable Measure for Information Leakage

- Computer Science, Mathematics · 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018

A tunable measure for information leakage called maximal α-leakage is introduced. This measure quantifies the maximal gain of an adversary in refining a tilted version of its prior belief of any…
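Maximal α-leakage is known to recover maximal leakage in the limit α → ∞. As a hedged sketch of that endpoint only, with a made-up channel: maximal leakage is the log of the sum, over output values, of the column-wise maximum of p(y | s).

```python
import math

# Hypothetical channel p(y | s) from the sensitive variable S to the release Y.
channel = {
    "s0": {"y0": 0.8, "y1": 0.2},
    "s1": {"y0": 0.2, "y1": 0.8},
}
outputs = ["y0", "y1"]

# Maximal leakage in bits: log2 of the sum over y of max_s p(y | s).
max_leakage_bits = math.log2(
    sum(max(channel[s][y] for s in channel) for y in outputs)
)  # log2(1.6), about 0.678 bits
```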

Differential Privacy

- Computer Science · ICALP
- 2006

A general impossibility result is given showing that a formalization of Dalenius' goal along the lines of semantic security cannot be achieved, which suggests a new measure, differential privacy, which, intuitively, captures the increased risk to one's privacy incurred by participating in a database.
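Differential privacy for numeric queries is commonly achieved with the Laplace mechanism. The following is a minimal sketch (a standard construction, not taken from the cited paper) that adds noise calibrated to the query's L1 sensitivity:

```python
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value plus Laplace(sensitivity / epsilon) noise,
    giving epsilon-differential privacy for a query whose L1
    sensitivity is `sensitivity` (e.g. 1 for a counting query)."""
    scale = sensitivity / epsilon
    # The difference of two iid Exp(1) variables is Laplace(0, 1).
    noise = scale * (rng.expovariate(1.0) - rng.expovariate(1.0))
    return true_value + noise

# A count query with sensitivity 1, released at epsilon = 0.5.
noisy_count = laplace_mechanism(42, sensitivity=1, epsilon=0.5)
```

Smaller epsilon means larger noise and stronger privacy; the scale sensitivity/epsilon is exactly what the differential privacy guarantee requires for this mechanism.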

k-Anonymity: A Model for Protecting Privacy

- Computer Science · Int. J. Uncertain. Fuzziness Knowl. Based Syst.
- 2002

The solution provided in this paper includes a formal protection model named k-anonymity and a set of accompanying policies for deployment, and examines re-identification attacks that can be realized on releases that adhere to k-anonymity unless the accompanying policies are respected.
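The k-anonymity property itself is easy to check mechanically: every combination of quasi-identifier values must appear in at least k rows of the released table. A minimal sketch with a made-up table:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True iff every combination of quasi-identifier values
    occurs in at least k rows of the table."""
    groups = Counter(tuple(row[c] for c in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())

# Hypothetical release: ZIP prefix and age bracket are the quasi-identifiers.
table = [
    {"zip": "130**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "130**", "age": "20-29", "diagnosis": "cold"},
    {"zip": "148**", "age": "30-39", "diagnosis": "flu"},
]

is_k_anonymous(table, ["zip", "age"], 2)   # False: the last row is unique
```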

Quantifying computational security subject to source constraints, guesswork and inscrutability

- Mathematics, Computer Science · 2015 IEEE International Symposium on Information Theory (ISIT)
- 2015

It is proved that the inscrutability rate of any string-source supported on a finite alphabet 𝒳, if it exists, lies between the per-symbol Shannon entropy constraint and log |𝒳|.
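The two bounds in that result can be computed numerically for a toy memoryless source (the per-symbol distribution below is made up):

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Per-symbol distribution of a memoryless source on a 4-letter alphabet.
p = [0.5, 0.25, 0.125, 0.125]

h = shannon_entropy_bits(p)          # 1.75 bits
log_alphabet = math.log2(len(p))     # 2.0 bits
# The cited result places the inscrutability rate between h and log_alphabet.
```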

Multi-User Guesswork and Brute Force Security

- Mathematics, Computer Science · IEEE Transactions on Information Theory
- 2015

It is established that, unless U=V, there is no general strategy that minimizes the distribution of the number of guesses; in the asymptotic regime, however, as the strings become long, results are obtained which provide a bound on computational security for multi-user systems.