Corpus ID: 248811155

Network change point localisation under local differential privacy

@inproceedings{Li2022NetworkCP,
  title={Network change point localisation under local differential privacy},
  author={Mengchu Li and Thomas B. Berrett and Yi Yu},
  year={2022}
}
Network data are ubiquitous in our daily life, containing rich but often sensitive information. In this paper, we expand the current static analysis of privatised networks to a dynamic framework by considering a sequence of networks with potential change points. We investigate the fundamental limits in consistently localising change points under both node and edge privacy constraints, demonstrating an interesting phase transition in terms of the signal-to-noise ratio condition, accompanied by…
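
To make the data-release model concrete, here is a minimal sketch (not the authors' procedure; all names and parameters are illustrative) of privatising a sequence of adjacency matrices under edge local differential privacy via Warner-style randomised response, followed by debiasing so that downstream change point analysis can work with unbiased surrogates of the true networks.

import numpy as np

def privatise_sequence(adj_seq, eps, rng):
    """Apply randomised response independently to every edge indicator.

    Each bit is reported truthfully with probability exp(eps) / (1 + exp(eps)),
    a standard eps-edge-LDP mechanism; the report is then debiased so that its
    expectation equals the true adjacency matrix."""
    p = np.exp(eps) / (1.0 + np.exp(eps))
    out = []
    for adj in adj_seq:
        noisy = np.where(rng.random(adj.shape) < p, adj, 1 - adj).astype(float)
        out.append((noisy - (1 - p)) / (2 * p - 1))
    return out

# Toy sequence of Erdos-Renyi networks whose edge probability jumps at t = 30.
rng = np.random.default_rng(0)
n, T, eps = 50, 60, 1.0
adj_seq = [np.triu((rng.random((n, n)) < (0.1 if t < 30 else 0.3)).astype(int), 1)
           for t in range(T)]
private_seq = privatise_sequence(adj_seq, eps, rng)

A scan statistic, for example the CUSUM sketch given under the change point detection reference below, can then be applied to private_seq in place of the raw networks.
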
1 Citation

On robustness and local differential privacy

This work is the first to systematically study the connections between optimality under Huber’s contamination model and local differential privacy (LDP) constraints, and it showcases a promising prospect for the joint study of robustness and local differential privacy.

References

Showing 1-10 of 49 references

Towards Private Learning on Decentralized Graphs With Local Differential Privacy

Solitude is a new privacy-preserving learning framework based on graph neural networks (GNNs), with formal guarantees based on edge local differential privacy; it simultaneously protects node feature privacy and edge privacy and can be seamlessly incorporated into any GNN with privacy-utility guarantees.

Analyzing Graphs with Node Differential Privacy

A generic, efficient reduction is derived that allows any differentially private algorithm for bounded-degree graphs to be applied to an arbitrary graph, based on analyzing the smooth sensitivity of the 'naive' truncation that simply discards nodes of high degree.
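
As a rough illustration of this reduction (and only that: the sketch below uses naive truncation plus Laplace noise calibrated to the degree bound, and deliberately ignores the smooth-sensitivity analysis the paper develops to make this step private; all names are illustrative), one might write:

import numpy as np

def truncate_to_bounded_degree(adj, d_max):
    """Naive truncation: discard every node whose degree exceeds d_max."""
    keep = adj.sum(axis=1) <= d_max
    return adj[keep][:, keep]

def noisy_edge_count(adj, d_max, eps, rng):
    """Edge count of the truncated graph plus Laplace noise.

    On graphs with maximum degree d_max, adding or removing one node changes
    the edge count by at most d_max, which is the sensitivity used here; the
    instability of the truncation step itself is what smooth sensitivity
    addresses, and it is not handled in this sketch."""
    truncated = truncate_to_bounded_degree(adj, d_max)
    return truncated.sum() / 2 + rng.laplace(scale=d_max / eps)

rng = np.random.default_rng(1)
n = 100
adj = np.triu((rng.random((n, n)) < 0.05).astype(int), 1)
adj = adj + adj.T                                  # symmetric, zero diagonal
print(noisy_edge_count(adj, d_max=10, eps=1.0, rng=rng))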

Locally Differentially Private Analysis of Graph Statistics

This work considers the local model and presents algorithms for counting subgraphs, a fundamental task for analyzing the connection patterns in a graph, under local differential privacy (LDP), showing that it is indeed possible to accurately estimate subgraph counts in this model.
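
A hedged sketch of one-round subgraph counting under edge LDP follows: a generic unbiased estimator built from randomised-response reports, assuming a single report per unordered node pair for simplicity; it is not necessarily the estimator proposed in this reference, and the variance of such one-round estimators can be large.

import numpy as np

def debiased_rr_adjacency(adj, eps, rng):
    """Randomise each upper-triangular edge bit, then debias and symmetrise."""
    p = np.exp(eps) / (1.0 + np.exp(eps))
    upper = np.triu(adj, 1)
    noisy = np.where(rng.random(adj.shape) < p, upper, 1 - upper)
    debiased = np.triu((noisy - (1 - p)) / (2 * p - 1), 1)
    return debiased + debiased.T                     # symmetric, zero diagonal

def triangle_estimate(tilde_a):
    """Unbiased estimate of the triangle count from the debiased matrix.

    With a zero diagonal, trace(tilde_a^3) sums tilde_a[i,j]*tilde_a[j,k]*tilde_a[k,i]
    over distinct triples, and independence of the per-pair noise makes each
    term unbiased for the corresponding triangle indicator."""
    return np.trace(np.linalg.matrix_power(tilde_a, 3)) / 6.0

rng = np.random.default_rng(2)
n, eps = 150, 2.0
adj = np.triu((rng.random((n, n)) < 0.1).astype(int), 1)
adj = adj + adj.T
true_count = np.trace(np.linalg.matrix_power(adj, 3)) / 6.0
est_count = triangle_estimate(debiased_rr_adjacency(adj, eps, rng))
print("true:", true_count, "estimate:", round(est_count, 1))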

Edge differentially private estimation in the $\beta$-model via jittering and method of moments

An in-depth study of the trade-off between the level of privacy and the efficiency of statistical inference in the $\beta$-model for edge differentially private network data released via jittering, together with a novel adaptive bootstrap procedure for constructing uniform inference across different phases.
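
The jittering idea is simple to sketch: release every upper-triangular edge indicator plus independent Laplace noise, which gives edge differential privacy (neighbouring graphs differing in one edge) with sensitivity one, and fit the $\beta$-model by matching expected degrees to the resulting noisy degrees. The fixed-point iteration and toy example below are illustrative only, with all names invented here; the paper's estimator, its phase analysis, and the adaptive bootstrap are not reproduced.

import numpy as np

def jitter(adj, eps, rng):
    """Release the adjacency matrix with i.i.d. Laplace(1/eps) noise per edge bit."""
    n = adj.shape[0]
    noise = np.triu(rng.laplace(scale=1.0 / eps, size=(n, n)), 1)
    noise = noise + noise.T
    return adj + noise

def fit_beta_model(degrees, n_iter=200):
    """Moment-style fixed-point iteration matching expected degrees to the
    (noisy) observed degrees in the beta-model."""
    d = np.clip(degrees, 1e-3, len(degrees) - 1.0)   # guard against noise
    beta = np.zeros(len(degrees))
    for _ in range(n_iter):
        e = np.exp(beta)
        denom = e[None, :] / (1.0 + e[:, None] * e[None, :])
        np.fill_diagonal(denom, 0.0)
        beta = np.log(d) - np.log(denom.sum(axis=1))
    return beta

rng = np.random.default_rng(4)
n, eps = 300, 2.0
beta_true = rng.normal(-1.0, 0.5, size=n)
p = 1.0 / (1.0 + np.exp(-(beta_true[:, None] + beta_true[None, :])))
np.fill_diagonal(p, 0.0)
adj = np.triu((rng.random((n, n)) < p).astype(int), 1)
adj = adj + adj.T
noisy_degrees = jitter(adj, eps, rng).sum(axis=1)    # unbiased for true degrees
print(np.corrcoef(beta_true, fit_beta_model(noisy_degrees))[0, 1])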

On robustness and local differential privacy

This work is the first to systematically study the connections between optimality under Huber’s contamination model and local differential privacy (LDP) constraints, and it showcases a promising prospect for the joint study of robustness and local differential privacy.

Differentially Private Community Detection for Stochastic Block Models

The main findings are that stability- and sampling-based mechanisms lead to a superior trade-off between (p, q) and the privacy budget (ϵ, δ); however, this comes at the expense of higher computational complexity.

Local differential privacy: Elbow effect in optimal density estimation and adaptation over Besov ellipsoids

A recent generalisation of classical minimax theory to the framework of local $\alpha$-differential privacy is adopted, and a lower bound on the rate of convergence over Besov spaces under mean integrated $\mathbb L^r$-risk is provided.
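
For orientation, the minimax framework referred to here can be written in standard notation (symbols matching the title and summary; the precise rates and the location of the elbow are the paper's contribution and are not reproduced). A channel $Q$ is $\alpha$-locally differentially private if
$$\sup_{A}\ \sup_{x,x'}\ \frac{Q(A \mid x)}{Q(A \mid x')} \le e^{\alpha},$$
and the object of study is the private minimax risk
$$\inf_{Q \in \mathcal{Q}_\alpha}\ \inf_{\hat f}\ \sup_{f \in B^{s}_{p,q}(L)} \mathbb{E}_{f,Q}\, \| \hat f - f \|_{L^r}^r,$$
whose behaviour over Besov ellipsoids exhibits the elbow effect mentioned in the title.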

Optimal change point detection and localization in sparse dynamic networks

This work proposes a novel, computationally simple algorithm for network change point localization, called Network Binary Segmentation, which relies on weighted averages of the adjacency matrices, and devises a more sophisticated algorithm based on singular value thresholding, called Local Refinement, which delivers more accurate estimates of the change point locations.
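
The role of weighted averages of adjacency matrices can be illustrated with a CUSUM-type scan. The sketch below is a simplified stand-in (single change point, Frobenius norm, optional low-rank projection as a crude proxy for singular value thresholding) and is not the Network Binary Segmentation or Local Refinement algorithms themselves; it can be run, for instance, on the private_seq produced in the privatisation sketch near the top of this page.

import numpy as np

def cusum_matrix(mats, t):
    """Weighted difference of adjacency averages before and after time t."""
    T = len(mats)
    left = np.mean(mats[:t], axis=0)
    right = np.mean(mats[t:], axis=0)
    return np.sqrt(t * (T - t) / T) * (left - right)

def scan_single_change_point(mats, rank=None):
    """Return the time maximising a norm of the CUSUM matrix.

    If `rank` is given, the CUSUM matrix is first projected onto its leading
    singular subspace, a crude stand-in for singular value thresholding."""
    T = len(mats)
    best_t, best_stat = None, -np.inf
    for t in range(1, T):
        C = cusum_matrix(mats, t)
        if rank is not None:
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            C = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        stat = np.linalg.norm(C)      # Frobenius norm of the (projected) CUSUM
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t, best_stat

# Example usage on a hypothetical sequence `mats` of (privatised) adjacency matrices:
# t_hat, stat = scan_single_change_point(mats, rank=2)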

Local Privacy, Data Processing Inequalities, and Statistical Minimax Rates

This work proves bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that depend on the privacy guarantees, and provides a treatment of several canonical families of problems: mean estimation, parameter estimation in fixed-design regression, multinomial probability estimation, and nonparametric density estimation.
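
The flavour of these bounds can be recalled compactly, stated here in the commonly cited form (exact constants as in the paper): if $Q$ is an $\alpha$-locally differentially private channel and $M_1, M_2$ are the output distributions it induces from inputs $P_1, P_2$, then
$$ D_{\mathrm{kl}}(M_1 \,\|\, M_2) + D_{\mathrm{kl}}(M_2 \,\|\, M_1) \;\le\; 4\,\big(e^{\alpha}-1\big)^2\, \|P_1 - P_2\|_{\mathrm{TV}}^2, $$
so plugging such contraction bounds into Le Cam- or Fano-type arguments converts non-private minimax lower bounds into private ones, with the sample size $n$ effectively replaced by roughly $n\alpha^2$ when $\alpha$ is small.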

Inference using noisy degrees: Differentially private $\beta$-model and synthetic graphs

The paper addresses shortcomings of current approaches to the fundamental problem of performing valid statistical inference from data released by privacy mechanisms, and lays foundational groundwork for achieving optimal and private statistical inference in a principled manner by modeling the privacy mechanism.