Decentralized Statistical Inference with Unrolled Graph Neural Networks

@inproceedings{Wang2021DecentralizedSI,
  title={Decentralized Statistical Inference with Unrolled Graph Neural Networks},
  author={He Wang and Yifei Shen and Ziyuan Wang and Dongsheng Li and Jun Zhang and Khaled Ben Letaief and Jie Lu},
  booktitle={2021 60th IEEE Conference on Decision and Control (CDC)},
  year={2021},
  pages={2634--2640}
}
• Published 4 April 2021
• Computer Science
In this paper, we investigate the decentralized statistical inference problem, where a network of agents cooperatively recovers a (structured) vector from private noisy samples without centralized coordination. Existing optimization-based algorithms suffer from model mismatch and slow convergence, so their performance degrades when the number of communication rounds is limited. This motivates us to propose a learning-based framework, which unrolls well…
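As a rough illustration of the unrolling idea (a minimal sketch, not the authors' exact architecture), the snippet below stacks "layers" that each perform one communication round with neighbors followed by a local proximal-gradient step, on a toy decentralized LASSO problem. The per-layer step size `alpha` and threshold `lam` are hand-set here; in the paper's framework such parameters would be learned from data.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1: promotes sparsity in the estimate
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def unrolled_layer(X, W, A_list, y_list, alpha, lam):
    # One "layer": mix with neighbors (one communication round), then take a
    # local proximal-gradient step on each agent's private least-squares loss.
    X_mix = W @ X                                   # neighbor averaging
    G = np.stack([A.T @ (A @ x - y)                 # local gradient of 1/2||A x - y||^2
                  for A, x, y in zip(A_list, X, y_list)])
    return soft_threshold(X_mix - alpha * G, alpha * lam)

# Toy setup: 8 agents on a ring cooperatively recover a sparse vector
rng = np.random.default_rng(0)
n_agents, d, m = 8, 20, 15
x_true = np.zeros(d)
x_true[rng.choice(d, 4, replace=False)] = rng.normal(size=4)
A_list = [rng.normal(size=(m, d)) / np.sqrt(m) for _ in range(n_agents)]
y_list = [A @ x_true + 0.01 * rng.normal(size=m) for A in A_list]

# doubly stochastic mixing matrix for a ring graph
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = W[i, (i + 1) % n_agents] = 0.25

X = np.zeros((n_agents, d))            # one row of estimates per agent
for _ in range(200):                   # 200 layers with hand-set parameters
    X = unrolled_layer(X, W, A_list, y_list, alpha=0.1, lam=0.01)

rel_err = np.linalg.norm(X.mean(axis=0) - x_true) / np.linalg.norm(x_true)
```

With untrained, constant parameters this is just a decentralized proximal-gradient method; the learning-based framework replaces the hand tuning by training the per-layer parameters, which is where the convergence-speed gains under limited communication rounds come from.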
2 Citations

AI Empowered Resource Management for Future Wireless Networks
• Computer Science
2021 IEEE International Mediterranean Conference on Communications and Networking (MeditCom)
• 2021
For the K-user interference management problem, it is theoretically shown that graph neural networks (GNNs) are superior to multi-layer perceptrons (MLPs), and that the performance gap between the two methods grows with $\sqrt{K}$.
Scalable Power Control/Beamforming in Heterogeneous Wireless Networks with Graph Neural Networks
• Computer Science
2021 IEEE Global Communications Conference (GLOBECOM)
• 2021
A novel unsupervised learning-based framework named heterogeneous interference graph neural network (HIGNN) is proposed to empower each link to obtain its individual transmission scheme after limited information exchange with neighboring links.

References

SHOWING 1-10 OF 31 REFERENCES
Graph Neural Networks for Scalable Radio Resource Management: Architecture Design and Theoretical Analysis
• Computer Science
IEEE Journal on Selected Areas in Communications
• 2021
This paper demonstrates that radio resource management problems can be formulated as graph optimization problems that enjoy a universal permutation equivariance property, and identifies a family of neural networks, named message passing graph neural networks (MPGNNs), which can generalize to large-scale problems, while enjoying a high computational efficiency.
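The permutation equivariance property can be checked concretely with a generic message-passing layer (a minimal sketch, not the MPGNN architecture from the paper): relabeling the graph's nodes permutes the layer's output rows in exactly the same way.

```python
import numpy as np

def mp_layer(H, A, W_self, W_msg):
    # One message-passing layer with elementwise-max aggregation.
    # H: (N, d) node features; A: (N, N) 0/1 adjacency without self-loops.
    msgs = H @ W_msg
    agg = np.array([msgs[A[i] > 0].max(axis=0) if A[i].any()
                    else np.zeros(msgs.shape[1]) for i in range(len(H))])
    return np.maximum(H @ W_self + agg, 0.0)        # ReLU node update

# Check equivariance: permuting node labels permutes the output rows
rng = np.random.default_rng(1)
N, d, d_out = 6, 4, 5
H = rng.normal(size=(N, d))
A = (rng.random((N, N)) < 0.5).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 0)
W_self, W_msg = rng.normal(size=(d, d_out)), rng.normal(size=(d, d_out))

p = rng.permutation(N)
out = mp_layer(H, A, W_self, W_msg)
out_perm = mp_layer(H[p], A[np.ix_(p, p)], W_self, W_msg)
equivariant = np.allclose(out_perm, out[p])   # True
```

Equivariance holds because the per-node update depends on the neighbor messages only through an order-independent aggregation (here, the elementwise max), which is what lets such networks generalize across graphs of different sizes.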
GLAD: Learning Sparse Graph Recovery
• Computer Science
ICLR
• 2020
A deep learning architecture, GLAD, is proposed, which uses an alternating minimization algorithm as its inductive bias and learns the model parameters via supervised learning; it is shown that GLAD learns a very compact and effective model for recovering sparse graphs from data.
A Unified Algorithmic Framework for Distributed Composite Optimization
• Computer Science
2020 59th IEEE Conference on Decision and Control (CDC)
• 2020
A by-product of the analysis is a tuning recommendation for several existing (non-accelerated) distributed algorithms yielding the fastest provable (worst-case) convergence rate.
A Proximal Gradient Algorithm for Decentralized Composite Optimization
• Computer Science, Mathematics
IEEE Transactions on Signal Processing
• 2015
A proximal gradient exact first-order algorithm (PG-EXTRA) that utilizes the composite structure, has the best known convergence rate, and is a nontrivial extension of the recent algorithm EXTRA.
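A minimal sketch of the PG-EXTRA iteration as described, applied to a toy decentralized LASSO problem (the ring topology, step size, and regularization weight are illustrative assumptions, not values from the paper):

```python
import numpy as np

def pg_extra(X0, W, grad, prox, alpha, n_iter):
    # PG-EXTRA for sum_i [ s_i(x) + r(x) ], s_i smooth, r nonsmooth.
    # Uses two mixing matrices, W and W_tilde = (I + W)/2.
    W_tilde = 0.5 * (np.eye(W.shape[0]) + W)
    X_prev = X0
    Z = W @ X0 - alpha * grad(X0)                 # half-step iterate
    X = prox(Z, alpha)
    for _ in range(n_iter):
        Z = Z + W @ X - W_tilde @ X_prev - alpha * (grad(X) - grad(X_prev))
        X_prev, X = X, prox(Z, alpha)
    return X

# Stand-in composite problem: decentralized LASSO on a ring of 8 agents
rng = np.random.default_rng(2)
n, d, m = 8, 20, 15
x_true = np.zeros(d)
x_true[rng.choice(d, 4, replace=False)] = rng.normal(size=4)
A_list = [rng.normal(size=(m, d)) / np.sqrt(m) for _ in range(n)]
y_list = [A @ x_true + 0.01 * rng.normal(size=m) for A in A_list]

W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.25

def grad(X):   # stacked local gradients of 1/2 ||A_i x_i - y_i||^2
    return np.stack([A.T @ (A @ x - y) for A, x, y in zip(A_list, X, y_list)])

def prox(Z, alpha, lam=0.01):   # soft-threshold = prox of alpha * lam * ||.||_1
    return np.sign(Z) * np.maximum(np.abs(Z) - alpha * lam, 0.0)

X = pg_extra(np.zeros((n, d)), W, grad, prox, alpha=0.1, n_iter=300)
consensus_gap = np.linalg.norm(X - X.mean(axis=0))
rel_err = np.linalg.norm(X.mean(axis=0) - x_true) / np.linalg.norm(x_true)
```

The gradient-difference correction term is what distinguishes this from plain decentralized proximal-gradient descent: it cancels the steady-state consensus error, so a constant step size yields exact convergence rather than convergence to a neighborhood.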
A Decentralized Proximal-Gradient Method With Network Independent Step-Sizes and Separated Convergence Rates
• Computer Science
IEEE Transactions on Signal Processing
• 2019
This paper proposes a novel proximal-gradient algorithm for decentralized optimization with a composite objective containing smooth and nonsmooth terms; its step sizes are independent of the network topology, and its two separated convergence rates match the typical rates of general gradient descent and consensus averaging.
COLA: Decentralized Linear Learning
• Computer Science
NeurIPS
• 2018
This work proposes COLA, a new decentralized training algorithm with strong theoretical guarantees and superior practical performance, which achieves communication efficiency, scalability, elasticity, and resilience to changes in data, and allows for unreliable and heterogeneous participating devices.
Graph Neural Networks for Decentralized Controllers
• Computer Science
ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
• 2021
This paper proposes a framework using graph neural networks (GNNs) to learn decentralized controllers from data, finds that GNNs are naturally distributed architectures, making them perfectly suited for the task, and adapts them to handle delayed communications as well.