• Corpus ID: 233241010

FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks

@article{He2021FedGraphNNAF,
  title={FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks},
  author={Chaoyang He and Keshav Balasubramanian and Emir Ceyani and Yu Rong and Peilin Zhao and Junzhou Huang and Murali Annavaram and Salman Avestimehr},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.07145}
}
Graph Neural Network (GNN) research is rapidly growing thanks to the capacity of GNNs in learning distributed representations from graph-structured data. However, centralizing a massive amount of real-world graph data for GNN training is prohibitive due to privacy concerns, regulation restrictions, and commercial competitions. Federated learning (FL), a trending distributed learning paradigm, provides possibilities to solve this challenge while preserving data privacy. Despite recent advances… 
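For intuition, the sketch below illustrates the setting the abstract describes: several clients each hold a private graph and jointly train a GNN by exchanging only model weights with a server running FedAvg. The one-layer linear GCN, the MSE loss, and the synthetic client graphs are illustrative assumptions for this sketch, not FedGraphNN's actual models or API.

import numpy as np

def normalize_adj(A):
    """Symmetrically normalize an adjacency matrix with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def local_update(A, X, Y, W, lr=0.1, steps=5):
    """A few local gradient steps on a one-layer linear GCN with MSE loss (illustrative)."""
    H = normalize_adj(A) @ X                     # propagated node features
    n = X.shape[0]
    for _ in range(steps):
        Z = H @ W                                # node predictions
        W = W - lr * (2.0 / n) * H.T @ (Z - Y)   # analytic MSE gradient w.r.t. W
    return W, n

def fedavg(weights, sizes):
    """Server-side FedAvg: average client models weighted by local node counts."""
    total = sum(sizes)
    return sum(w * (s / total) for w, s in zip(weights, sizes))

rng = np.random.default_rng(0)
num_clients, feat_dim, num_classes = 3, 8, 2
clients = []                                     # each client holds a private graph
for _ in range(num_clients):
    n = int(rng.integers(10, 20))
    A = np.triu((rng.random((n, n)) < 0.2).astype(float), 1)
    A = A + A.T                                  # undirected, no self-loops
    X = rng.normal(size=(n, feat_dim))
    Y = np.eye(num_classes)[rng.integers(0, num_classes, size=n)]
    clients.append((A, X, Y))

W_global = np.zeros((feat_dim, num_classes))
for _ in range(10):                              # federated training rounds
    updates, sizes = [], []
    for A, X, Y in clients:
        W_local, n = local_update(A, X, Y, W_global.copy())
        updates.append(W_local); sizes.append(n)
    W_global = fedavg(updates, sizes)            # only model weights leave the clients

Only the weight matrices leave the clients; the raw adjacency matrices, features, and labels stay local, which is the privacy boundary federated graph learning relies on.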
SpreadGNN: Serverless Multi-task Federated Learning for Graph Neural Networks
TLDR
This work proposes SpreadGNN, a novel multi-task federated training framework capable of operating in the presence of partial labels and the absence of a central server for the first time in the literature, and demonstrates the efficacy of the framework on a variety of non-I.I.D. distributed graph-level molecular property prediction datasets with partial labels.
Federated Graph Learning - A Position Paper
TLDR
Considering how graph data are distributed among clients, a categorization of four types of FGL is proposed: inter-graph FL, intra-graph FL, and graph-structured FL, where intra-graph FL is further divided into horizontal and vertical FGL.
Federated Graph Classification over Non-IID Graphs
TLDR
This work proposes a graph clustered federated learning (GCFL) framework that dynamically finds clusters of local systems based on the gradients of GNNs, and theoretically justifies that such clusters can reduce the structure and feature heterogeneity among graphs owned by the local systems.
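A rough sketch of the clustering idea summarized above, assuming the server groups clients whose flattened GNN gradients point in similar directions and then aggregates within each group; the cosine-similarity threshold and the greedy grouping below are illustrative choices, not GCFL's exact bi-partitioning rule.

import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def cluster_clients(gradients, threshold=0.5):
    """Greedily group clients whose flattened gradients have cosine similarity
    above `threshold` with every member already in the cluster."""
    clusters = []
    for idx, g in enumerate(gradients):
        for cluster in clusters:
            if all(cosine(g, gradients[j]) >= threshold for j in cluster):
                cluster.append(idx)
                break
        else:
            clusters.append([idx])
    return clusters

# Toy example: clients 0-1 share one data distribution, clients 2-3 another,
# so their gradients roughly align within each pair but not across pairs.
rng = np.random.default_rng(1)
base_a, base_b = rng.normal(size=64), rng.normal(size=64)
grads = [base_a + 0.1 * rng.normal(size=64), base_a + 0.1 * rng.normal(size=64),
         base_b + 0.1 * rng.normal(size=64), base_b + 0.1 * rng.normal(size=64)]
print(cluster_clients(grads))  # expected grouping, e.g. [[0, 1], [2, 3]]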
Federated Graph Machine Learning: A Survey of Concepts, Techniques, and Applications
TLDR
This survey conducts a comprehensive review of the literature in Federated Graph Machine Learning and provides a new taxonomy to divide the existing problems in FGML into two settings, namely, FL with structured data and structured FL.
FedGraph: Federated Graph Learning With Intelligent Sampling
TLDR
FedGraph is proposed, featuring an intelligent graph sampling algorithm based on deep reinforcement learning that can automatically converge to optimal sampling policies balancing training speed and accuracy; it significantly outperforms existing work by enabling faster convergence to higher accuracy.
FedGCN: Convergence and Communication Tradeoffs in Federated Training of Graph Convolutional Networks
TLDR
Federated Graph Convolutional Network (FedGCN) is introduced, which uses federated learning to train GCN models for semi-supervised node classification on large graphs with optimized convergence rate and communication cost.
FederatedScope-GNN: Towards a Unified, Comprehensive and Efficient Package for Federated Graph Learning
TLDR
This paper presents the implemented package FederatedScope-GNN (FS-G), which provides a unified view for modularizing and expressing FGL algorithms, and employs FS-G to serve FGL applications in real-world E-commerce scenarios, where the attained improvements indicate great potential business benefits.
A federated graph neural network framework for privacy-preserving personalization
TLDR
This work presents a federated GNN framework named FedPerGNN for both effective and privacy-preserving personalization, and introduces a privacy-preserving graph expansion protocol to incorporate high-order information under privacy protection.
More is Better (Mostly): On the Backdoor Attacks in Federated Graph Neural Networks
TLDR
Two types of backdoor attacks in Federated GNNs are conducted: centralized backdoor attacks (CBA) and distributed backdoor attacks (DBA), and the robustness of DBA and CBA against two state-of-the-art defenses is explored.
Federated Graph Contrastive Learning
TLDR
This paper investigates how to implement differential privacy on graph edges, observes the resulting performance degradation in experiments, and proposes to leverage the advantages of graph contrastive learning to alleviate the performance drop caused by differential privacy.
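As a hedged illustration of "differential privacy on graph edges", the snippet below applies randomized response to the adjacency matrix, a standard mechanism for edge-level local DP; the paper's own perturbation scheme and its contrastive-learning remedy may differ.

import numpy as np

def randomized_response_adjacency(A, epsilon, rng=None):
    """Flip each potential (undirected) edge with probability 1 / (1 + e^epsilon)."""
    rng = rng or np.random.default_rng()
    keep_prob = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    iu = np.triu_indices(A.shape[0], k=1)            # upper triangle, no diagonal
    flips = rng.random(len(iu[0])) > keep_prob       # True -> flip this entry
    A_noisy = np.zeros_like(A)
    A_noisy[iu] = np.where(flips, 1 - A[iu], A[iu])
    return A_noisy + A_noisy.T                       # keep the graph symmetric

rng = np.random.default_rng(2)
A = np.triu((rng.random((6, 6)) < 0.3).astype(int), 1)
A = A + A.T
print(randomized_response_adjacency(A, epsilon=2.0, rng=rng))  # noisier as epsilon shrinks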
…

References

SHOWING 1-10 OF 111 REFERENCES
ASFGNN: Automated separated-federated graph neural network
TLDR
An Automated Separated-Federated Graph Neural Network (ASFGNN) learning paradigm is proposed, which decouples the training of GNN into two parts: the message passing part that is done by clients separately, and the loss computing part that is learnt by clients federally.
Federated Graph Classification over Non-IID Graphs
TLDR
This work proposes a graph clustered federated learning (GCFL) framework that dynamically finds clusters of local systems based on the gradients of GNNs, and theoretically justifies that such clusters can reduce the structure and feature heterogeneity among graphs owned by the local systems.
GraphFL: A Federated Learning Framework for Semi-Supervised Node Classification on Graphs
TLDR
This work proposes GraphFL, the first FL framework for semi-supervised node classification on graphs, introduces two GraphFL methods to respectively address the non-IID issue in graph data and handle tasks with new label domains, and designs a self-training method to leverage unlabeled graph data.
DistDGL: Distributed Graph Neural Network Training for Billion-Scale Graphs
  • Da Zheng, Chao Ma, G. Karypis
  • Computer Science
    2020 IEEE/ACM 10th Workshop on Irregular Applications: Architectures and Algorithms (IA3)
  • 2020
TLDR
The results show that DistDGL achieves linear speedup without compromising model accuracy and requires only 13 seconds to complete a training epoch for a graph with 100 million nodes and 3 billion edges on a cluster with 16 machines.
Privacy-Preserving Graph Neural Network for Node Classification
TLDR
This paper proposes a Privacy-Preserving GNN (PPGNN) learning paradigm for the node classification task, which can be generalized to existing GNN models, and demonstrates that PPGNN significantly outperforms the GNN models trained on isolated data and has comparable performance with the traditional GNN trained on the mixed plaintext data.
Subgraph Federated Learning with Missing Neighbor Generation
TLDR
Two major techniques are proposed: FedSage, which trains a GraphSage model based on FedAvg to integrate node features, link structures, and task labels on multiple local subgraphs; and FedSage+, which trains a missing neighbor generator alongside FedSage to deal with missing links across local subgraphs.
SGNN: A Graph Neural Network Based Federated Learning Approach by Hiding Structure
TLDR
A similarity-based graph neural network model, SGNN, is proposed, which captures the structure information of nodes precisely in node classification tasks and takes advantage of the idea of federated learning to hide the original information from different data sources to protect users' privacy.
FedML: A Research Library and Benchmark for Federated Machine Learning
TLDR
FedML is introduced, an open research library and benchmark that facilitates the development of new federated learning algorithms and fair performance comparisons, providing an efficient and reproducible means of developing and evaluating algorithms for the federated learning research community.
Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge
TLDR
This work reformulates FL as a group knowledge transfer training algorithm, called FedGKT, which designs a variant of the alternating minimization approach to train small CNNs on edge nodes and periodically transfer their knowledge by knowledge distillation to a large server-side CNN.
Cross-Node Federated Graph Neural Network for Spatio-Temporal Data Modeling
TLDR
A federated spatio-temporal model -- Cross-Node Federated Graph Neural Network (CNFGNN) -- which explicitly encodes the underlying graph structure using graph neural network (GNN)-based architecture under the constraint of cross-node federated learning, which requires that data in a network of nodes is generated locally on each node and remains decentralized.
…