Decentralized Statistical Inference with Unrolled Graph Neural Networks

@inproceedings{Wang2021DecentralizedSI,
  title={Decentralized Statistical Inference with Unrolled Graph Neural Networks},
  author={He Wang and Yifei Shen and Ziyuan Wang and Dongsheng Li and Jun Zhang and Khaled Ben Letaief and Jie Lu},
  booktitle={2021 60th IEEE Conference on Decision and Control (CDC)},
  year={2021},
  pages={2634-2640}
}
  • He Wang, Yifei Shen, Ziyuan Wang, Dongsheng Li, Jun Zhang, Khaled Ben Letaief, Jie Lu
  • Published 4 April 2021
  • Computer Science
  • 2021 60th IEEE Conference on Decision and Control (CDC)
In this paper, we investigate the decentralized statistical inference problem, where a network of agents cooperatively recovers a (structured) vector from private noisy samples without centralized coordination. Existing optimization-based algorithms suffer from model mismatch and slow convergence, so their performance degrades when the number of communication rounds is limited. This motivates us to propose a learning-based framework, which unrolls well…
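
The unrolling idea the truncated abstract refers to can be sketched concretely. Below, a generic decentralized proximal-gradient (Prox-DGD-style) iteration for a sparse least-squares model is unrolled into a fixed number of layers with learnable per-layer step sizes and thresholds. The function names, the quadratic data-fit term, and the l1 regularizer are illustrative assumptions, not the paper's actual architecture.

import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1; promotes sparse estimates."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def unrolled_prox_dgd(Y, A, W, alphas, taus):
    """Unroll a fixed number of decentralized proximal-gradient steps.

    Y:      (n_agents, m) private noisy samples, one row per agent
    A:      (n_agents, m, d) local measurement matrices
    W:      (n_agents, n_agents) doubly stochastic mixing matrix
    alphas: per-layer step sizes (trained, not hand-tuned)
    taus:   per-layer thresholds (trained, not hand-tuned)
    """
    n, d = A.shape[0], A.shape[2]
    X = np.zeros((n, d))                        # each agent's local estimate
    for alpha, tau in zip(alphas, taus):        # one "layer" per entry
        grad = np.stack([A[i].T @ (A[i] @ X[i] - Y[i]) for i in range(n)])
        X = soft_threshold(W @ X - alpha * grad, tau)  # mix, descend, prox
    return X

Training the per-layer parameters end to end on sample problems is what lets an unrolled network outperform the hand-tuned iteration when the communication budget (i.e., the number of layers) is fixed.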

Citations

AI Empowered Resource Management for Future Wireless Networks
TLDR
For the K-user interference management problem, it is theoretically shown that graph neural networks (GNNs) are superior to multi-layer perceptrons (MLPs), and that the performance gap between the two methods grows with $\sqrt{K}$.
Scalable Power Control/Beamforming in Heterogeneous Wireless Networks with Graph Neural Networks
TLDR
A novel unsupervised learning-based framework named heterogeneous interference graph neural network (HIGNN) is proposed to empower each link to obtain its individual transmission scheme after limited information exchange with neighboring links.

References

SHOWING 1-10 OF 31 REFERENCES
Graph Neural Networks for Scalable Radio Resource Management: Architecture Design and Theoretical Analysis
TLDR
This paper demonstrates that radio resource management problems can be formulated as graph optimization problems that enjoy a universal permutation equivariance property, and identifies a family of neural networks, named message passing graph neural networks (MPGNNs), which can generalize to large-scale problems while enjoying high computational efficiency.
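
As a rough illustration of the message-passing architecture this refers to, here is a minimal single layer in numpy; the weight names and the ReLU update are assumptions, not the MPGNN design from the paper.

import numpy as np

def mpgnn_layer(H, adj, W_self, W_neigh):
    """One message-passing layer: aggregate neighbor features, then update.

    H:       (n_nodes, d) node features (e.g., per-link channel states)
    adj:     (n_nodes, n_nodes) 0/1 adjacency of the interference graph
    W_self:  (d, d) learnable weight on a node's own feature
    W_neigh: (d, d) learnable weight on the aggregated messages
    """
    msgs = adj @ H                                        # sum over neighbors
    return np.maximum(H @ W_self + msgs @ W_neigh, 0.0)   # ReLU update

Relabeling the nodes permutes the rows of H and adj consistently, and the output permutes the same way; this is the permutation equivariance property the paper builds on.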
GLAD: Learning Sparse Graph Recovery
TLDR
A deep learning architecture, GLAD, is proposed that uses an alternating minimization algorithm as its model inductive bias and learns the model parameters via supervised learning; GLAD is shown to learn a very compact and effective model for recovering sparse graphs from data.
A Unified Algorithmic Framework for Distributed Composite Optimization
TLDR
A by-product of the analysis is a tuning recommendation for several existing (non-accelerated) distributed algorithms that yields the fastest provable (worst-case) convergence rate.
A Proximal Gradient Algorithm for Decentralized Composite Optimization
TLDR
A proximal gradient exact first-order algorithm (PG-EXTRA) is proposed that exploits the composite structure, attains the best known convergence rate, and is a nontrivial extension of the recent algorithm EXTRA.
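
The PG-EXTRA update is often written with a half-step variable and the averaged mixing matrix $\tilde{W} = (I + W)/2$; below is a minimal sketch under that convention, with generic gradient and prox callbacks standing in for the problem-specific pieces.

import numpy as np

def pg_extra(x0, grad_s, prox_r, W, alpha, n_iters):
    """Minimal PG-EXTRA sketch for min_x sum_i s_i(x) + r(x).

    x0:     (n_agents, d) initial local estimates
    grad_s: function X -> (n_agents, d) stacked local gradients of s_i
    prox_r: function (Z, alpha) -> prox of alpha*r, applied row-wise
    W:      (n_agents, n_agents) symmetric doubly stochastic mixing matrix
    """
    W_tilde = 0.5 * (np.eye(W.shape[0]) + W)
    g_prev = grad_s(x0)
    z = W @ x0 - alpha * g_prev                 # first half-step
    x_prev, x = x0, prox_r(z, alpha)
    for _ in range(n_iters - 1):
        g = grad_s(x)
        # z^{k+1} = z^k + W x^k - W~ x^{k-1} - alpha (grad^k - grad^{k-1})
        z = z + W @ x - W_tilde @ x_prev - alpha * (g - g_prev)
        x_prev, x, g_prev = x, prox_r(z, alpha), g
    return x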
A Decentralized Proximal-Gradient Method With Network Independent Step-Sizes and Separated Convergence Rates
TLDR
This paper proposes a novel proximal-gradient algorithm for a decentralized optimization problem with a composite objective containing smooth and nonsmooth terms, with network-independent step sizes and separated convergence rates that match the typical rates of general gradient descent and consensus averaging.
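
The nonsmooth term in such composite objectives is handled through its proximal operator; for the common l1 regularizer it has the standard closed form (a textbook identity, not specific to this paper) $\operatorname{prox}_{\alpha\lambda\|\cdot\|_1}(z)_i = \operatorname{sign}(z_i)\max(|z_i| - \alpha\lambda,\, 0)$, which each agent can evaluate locally in $O(d)$ time, so only the smooth gradient touches data and only the mixing step requires communication.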
COLA: Decentralized Linear Learning
TLDR
This work proposes COLA, a new decentralized training algorithm with strong theoretical guarantees and superior practical performance, that achieves communication efficiency, scalability, elasticity as well as resilience to changes in data and allows for unreliable and heterogeneous participating devices.
Graph Neural Networks for Decentralized Controllers
TLDR
This paper proposes a framework that uses graph neural networks (GNNs) to learn decentralized controllers from data, finds that GNNs are naturally distributed architectures, making them well suited to the task, and adapts them to handle delayed communications as well.
On Nonconvex Decentralized Gradient Descent
TLDR
Somewhat surprisingly, the decentralized consensus algorithms DGD and Prox-DGD retain most of the properties known in the convex setting, and can handle constraint sets that are nonconvex provided they admit an easy projection.
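
DGD itself is compact enough to state in a few lines; here is a hedged numpy sketch (function names are assumptions). Prox-DGD is obtained by composing each update with a proximal map, and the projected variant by replacing the prox with a projection onto the constraint set.

import numpy as np

def dgd(x0, local_grads, W, alpha, n_iters):
    """Decentralized gradient descent: mix with neighbors, then step.

    x0:          (n_agents, d) initial local iterates
    local_grads: function X -> (n_agents, d) stacked gradients of f_i
    W:           (n_agents, n_agents) doubly stochastic mixing matrix
    alpha:       step size (convexity of the f_i is not required)
    """
    X = x0
    for _ in range(n_iters):
        X = W @ X - alpha * local_grads(X)   # consensus + local gradient
    return X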
How Powerful are Graph Neural Networks?
TLDR
This work characterizes the discriminative power of popular GNN variants, such as Graph Convolutional Networks and GraphSAGE, shows that they cannot learn to distinguish certain simple graph structures, and develops a simple architecture that is provably the most expressive among the class of GNNs.
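
The maximally expressive architecture this points to (the Graph Isomorphism Network) replaces mean/max aggregation with an injective sum; a minimal sketch, with the MLP left abstract:

import numpy as np

def gin_layer(H, adj, eps, mlp):
    """GIN-style update: sum aggregation preserves neighbor multisets,
    which mean or max aggregation can collapse.

    H:   (n_nodes, d) node features
    adj: (n_nodes, n_nodes) 0/1 adjacency
    eps: learnable scalar weighting a node's own feature
    mlp: function (n_nodes, d) -> (n_nodes, d'), e.g., a small MLP
    """
    return mlp((1.0 + eps) * H + adj @ H)   # sum over neighbors

For example, neighbor feature multisets {a, a, b} and {a, b} share the same mean but not the same sum, so a sum aggregator can distinguish them while a mean aggregator cannot.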