Corpus ID: 224704510

Locally Differentially Private Analysis of Graph Statistics

Jacob Imola, Takao Murakami, and Kamalika Chaudhuri. USENIX Security Symposium.
Differentially private analysis of graphs is widely used for releasing statistics from sensitive graphs while preserving user privacy. Most existing algorithms, however, operate in a centralized privacy model, where a trusted data curator holds the entire graph. As this model raises a number of privacy and security issues, such as the trustworthiness of the curator and the possibility of data breaches, it is desirable to consider algorithms in a more decentralized local model where no server…
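
As a concrete illustration of the local model described above, each user can perturb the bits of their own adjacency vector with Warner's randomized response before anything leaves their device, so the server never sees a true edge. This is a minimal sketch under standard randomized-response analysis; the function names are chosen for illustration, not taken from the paper:

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Keep an adjacency bit with probability e^eps / (e^eps + 1); flip otherwise."""
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_keep else 1 - bit

def unbiased_edge_estimate(noisy_sum: float, n_bits: int, epsilon: float) -> float:
    """Debias the sum of noisy bits to estimate the true number of edges."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    q = 1.0 - p
    return (noisy_sum - q * n_bits) / (p - q)

# Each user perturbs their adjacency list locally; the server only sees noise.
random.seed(0)
adjacency_bits = [1, 0, 0, 1, 1, 0, 0, 0]
noisy = [randomized_response(b, epsilon=1.0) for b in adjacency_bits]
estimate = unbiased_edge_estimate(sum(noisy), len(adjacency_bits), epsilon=1.0)
```

The debiasing step matters: the raw noisy sum is a biased count, and dividing by (p − q) after subtracting the expected false positives recovers an unbiased estimator.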

Differentially Private Binary- and Matrix-Valued Data Query: An XOR Mechanism

Experimental results show that the XOR mechanism notably outperforms other state-of-the-art differentially private methods in terms of utility, and even achieves comparable utility to the non-private mechanisms.

Communication-Efficient Triangle Counting under Local Differential Privacy

This work proposes two-round algorithms consisting of edge sampling and carefully selecting the edges each user downloads so that the estimation error is small, and proposes a double clipping technique, which clips the number of edges and then the number of noisy triangles, to reduce the sensitivity of each user's query.
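
The clipping idea can be sketched generically. This is not the paper's exact double-clipping protocol, just the standard clip-then-add-Laplace pattern it builds on: bounding each user's contribution at a threshold bounds the sensitivity, so Laplace noise scaled to that threshold suffices.

```python
import random

def clipped_laplace_count(value: float, clip: float, epsilon: float) -> float:
    """Clip a per-user count at `clip`, then add Laplace(clip / epsilon) noise.

    Clipping bounds each user's contribution, so the sensitivity of the
    reported count is at most `clip` and Laplace(clip / epsilon) noise
    suffices for epsilon-DP on this report.
    """
    clipped = min(value, clip)
    scale = clip / epsilon
    # Sample Laplace(scale) as the difference of two exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return clipped + noise
```

The trade-off is bias versus noise: a smaller clipping threshold truncates more of the true count but allows proportionally less noise.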

A Crypto-Assisted Approach for Publishing Graph Statistics with Node Local Differential Privacy

This paper proposes an algorithm to publish the degree distribution under Node-LDP by exploring how to select the graph projection parameter in the local setting and how to execute the graph projection locally, and proposes a crypto-assisted local projection method based on cryptographic primitives that achieves higher accuracy than the authors' baseline pure-LDP local projection method.

SoK: Differential Privacy on Graph-Structured Data

This work systematises different formulations of DP on graphs, discusses challenges and promising applications, including the GNN domain, and compares and separates works into graph analysis tasks and graph learning tasks with GNNs.

Towards Private Learning on Decentralized Graphs With Local Differential Privacy

Solitude is a new privacy-preserving learning framework based on graph neural networks (GNNs), with formal privacy guarantees based on edge local differential privacy, that can simultaneously protect node feature privacy and edge privacy and can be seamlessly combined with any GNN with privacy-utility guarantees.

OPTT: Optimal Piecewise Transformation Technique for Analyzing Numerical Data under Local Differential Privacy

A principled framework for PTT in the context of LDP is provided, based on which PTT is studied systematically, and it is proved that for a family of PTTs, the corresponding theoretical lower bound on the noise variance follows O(ε⁻²) in the high privacy regime.

Differentially Private Subgraph Counting in the Shuffle Model

This paper proposes accurate subgraph counting algorithms by introducing a recently studied shuffle model and shows that they significantly outperform the one-round local algorithms in terms of accuracy and achieve small estimation errors with a reasonable privacy budget, e.g., smaller than 1 in edge DP.

Differentially Private Triangle and 4-Cycle Counting in the Shuffle Model

This paper proposes accurate triangle and 4-cycle counting algorithms by introducing a recently studied shuffle model and shows that these algorithms significantly outperform the one-round local algorithms in terms of accuracy and achieve small estimation errors with a reasonable privacy budget, e.g., smaller than 1 in edge DP.

Graph Analysis in Decentralized Online Social Networks with Fine-Grained Privacy Protection

This work designs an FGR-DP notion for social graph analysis, which enforces different protections for edges with distinct privacy requirements, and designs algorithms for triangle counting and k-star counting, respectively, which can accurately estimate subgraph counts under protection of social edges.

Differential Privacy from Locally Adjustable Graph Algorithms: k-Core Decomposition, Low Out-Degree Ordering, and Densest Subgraphs

This paper defines locally adjustable graph algorithms and shows that algorithms of this type can be transformed into differentially private algorithms, and presents an ε-locally edge differentially private (LEDP) algorithm for k-core decompositions.

Private Analysis of Graph Structure

This work extends the approach of Nissim et al. to a new class of statistics, namely k-star queries, and gives hardness results indicating that the approach used for triangles cannot easily be extended to k-triangles.

Analyzing Graphs with Node Differential Privacy

A generic, efficient reduction is derived that allows us to apply any differentially private algorithm for bounded-degree graphs to an arbitrary graph, based on analyzing the smooth sensitivity of the 'naive' truncation that simply discards nodes of high degree.
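
The 'naive' truncation mentioned above can be sketched as follows; this is an illustrative helper assuming the graph is given as an undirected edge list, not the paper's full reduction:

```python
from collections import defaultdict

def truncate_to_bounded_degree(edges, max_degree):
    """'Naive' truncation: drop every node whose degree exceeds max_degree,
    together with all of its incident edges, yielding a bounded-degree graph."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    keep = {node for node, d in degree.items() if d <= max_degree}
    return [(u, v) for u, v in edges if u in keep and v in keep]
```

The subtlety the paper addresses is that this truncation's output can change a lot when one node is added or removed, which is why its smooth sensitivity, rather than global sensitivity, has to be analyzed.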

Differentially Private Continual Release of Graph Statistics

This work shows that if there is a publicly known upper bound on the maximum degree of any node in the entire network sequence, then many common graph statistics, such as degree distributions and subgraph counts, can be released continually with a better privacy-accuracy tradeoff.

Generating Synthetic Decentralized Social Graphs with Local Differential Privacy

This paper proposes LDPGen, a novel multi-phase technique that incrementally clusters users based on their connections to different partitions of the whole population, and derives optimal parameters in this process to cluster structurally-similar users together.

Towards Locally Differentially Private Generic Graph Metric Estimation

LF-GDPR simplifies the job of implementing LDP-related steps for a graph metric estimation task by providing either a complete or a parameterized algorithm for each step.

Extremal Mechanisms for Local Differential Privacy

It is shown that for all information theoretic utility functions studied in this paper, maximizing utility is equivalent to solving a linear program, the outcome of which is the optimal staircase mechanism, which is universally optimal in the high and low privacy regimes.

Preserving Differential Privacy in Degree-Correlation based Graph Generation

Empirical evaluations show the developed private dK-graph generation models significantly outperform the approach based on the stochastic Kronecker generation model and achieve the strict differential privacy guarantee with smaller-magnitude noise.

Local, Private, Efficient Protocols for Succinct Histograms

Efficient protocols and matching accuracy lower bounds for frequency estimation in the local model of differential privacy are given, and it is shown that each user need only send 1 bit to the server in a model with public coins.
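
A toy version of a 1-bit protocol (not the paper's optimal construction) has each user answer a single randomized-response bit about whether their item matches a publicly assigned probe value, from which the server debiases a frequency estimate:

```python
import math
import random

def one_bit_report(item: int, probe: int, epsilon: float) -> int:
    """User sends one bit: a randomized-response answer to 'is your item == probe?'."""
    truth = 1 if item == probe else 0
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return truth if random.random() < p_keep else 1 - truth

def estimate_frequency(bits, epsilon: float) -> float:
    """Debiased fraction of users whose item matched their assigned probe."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    mean = sum(bits) / len(bits)
    return (mean - (1.0 - p)) / (2.0 * p - 1.0)
```

The communication cost per user is exactly one bit regardless of the domain size, which is the property the succinct-histogram protocols push to its information-theoretic limit.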

The Algorithmic Foundations of Differential Privacy

The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
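
The monograph's most fundamental technique, the Laplace mechanism, can be sketched in a few lines; this uses inverse-CDF sampling, and a counting query is the canonical example since it has sensitivity 1:

```python
import math
import random

def laplace_mechanism(query_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release query_answer + Laplace(sensitivity / epsilon) noise,
    the basic building block for epsilon-differential privacy."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return query_answer - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# A counting query changes by at most 1 when one record changes, so sensitivity = 1.
noisy_count = laplace_mechanism(1042.0, sensitivity=1.0, epsilon=0.5)
```

Smaller ε means a larger noise scale and stronger privacy; composing several such releases spends the sum of their ε budgets under basic composition.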

Utility-Optimized Local Differential Privacy Mechanisms for Distribution Estimation

The notion of ULDP (Utility-optimized LDP), which provides a privacy guarantee equivalent to LDP only for sensitive data, is introduced and it is shown that when most of the data are non-sensitive, the mechanisms provide almost the same utility as non-private mechanisms in the low privacy regime.