Corpus ID: 237513534

Federated Learning of Molecular Properties in a Heterogeneous Setting

@article{Zhu2021FederatedLO,
  title={Federated Learning of Molecular Properties in a Heterogeneous Setting},
  author={Wei Zhu and Andrew White and Jiebo Luo},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.07258}
}
  • Published 15 September 2021
  • Computer Science, Physics
  • ArXiv
Chemistry research incurs both high material and high computational costs to conduct experiments. Institutions thus consider chemical data to be valuable, and there have been few efforts to construct large public datasets for machine learning. Another challenge is that different institutions are interested in different classes of molecules, creating heterogeneous data that cannot be easily joined by conventional distributed training. In this work, we introduce federated heterogeneous molecular learning to…
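The heterogeneity described in the abstract can be made concrete with a small simulation. Below is a minimal Python sketch (the partition_by_class helper and its skew parameter are illustrative assumptions, not the paper's protocol) in which each molecule is assigned mostly to the institution that specializes in its class, yielding non-IID client datasets that cannot simply be pooled:

    import random
    from collections import defaultdict

    def partition_by_class(molecules, labels, num_clients, skew=0.8, seed=0):
        """Assign each molecule mostly to the client that favors its class."""
        rng = random.Random(seed)
        clients = defaultdict(list)
        for mol, cls in zip(molecules, labels):
            home = cls % num_clients  # the client specializing in this class
            dest = home if rng.random() < skew else rng.randrange(num_clients)
            clients[dest].append((mol, cls))
        return clients

    # Four simulated institutions, four molecule classes, heavily skewed splits.
    parts = partition_by_class([f"mol{i}" for i in range(1000)],
                               [i % 4 for i in range(1000)], num_clients=4)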


References

Showing 1-10 of 59 references
Data-Free Knowledge Distillation for Heterogeneous Federated Learning
TLDR: Empirical studies, supported by theoretical analysis, show that the proposed data-free knowledge distillation approach gives FL better generalization performance with fewer communication rounds than the state of the art.
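For context, the core of data-free distillation can be sketched in a few lines of PyTorch. This is an illustrative reconstruction, not the paper's exact algorithm (which also trains the generator), and all model shapes are placeholders: a generator produces synthetic inputs, and the global student is trained to match the averaged client-teacher predictions on them, so no client data is ever exchanged.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    latent_dim, feat_dim, num_classes = 16, 32, 4
    generator = nn.Sequential(nn.Linear(latent_dim, feat_dim), nn.ReLU(),
                              nn.Linear(feat_dim, feat_dim))
    teachers = [nn.Linear(feat_dim, num_classes) for _ in range(3)]  # stand-ins for client models
    student = nn.Linear(feat_dim, num_classes)                       # global model
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)

    for step in range(100):
        x = generator(torch.randn(64, latent_dim)).detach()  # synthetic inputs only
        with torch.no_grad():                                # ensemble the teachers
            teacher_probs = F.softmax(torch.stack([t(x) for t in teachers]).mean(0), dim=1)
        loss = F.kl_div(F.log_softmax(student(x), dim=1), teacher_probs,
                        reduction="batchmean")
        opt.zero_grad(); loss.backward(); opt.step()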
MoleculeNet: A Benchmark for Molecular Machine Learning
TLDR: MoleculeNet benchmarks demonstrate that learnable representations are powerful tools for molecular machine learning and broadly offer the best performance; however, this result comes with caveats.
FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks
TLDR: FedGraphNN is an open research federated learning system and benchmark for GNN-based FL research; built on a unified formulation of federated GNNs, it supports commonly used datasets, GNN models, FL algorithms, and flexible APIs.
Federated Knowledge Distillation
TLDR: This chapter aims to provide a deep understanding of federated distillation (FD), demystify its operational principle, and demonstrate its communication efficiency and applicability to a variety of tasks.
FedMD: Heterogenous Federated Learning via Model Distillation
TLDR: This work uses transfer learning and knowledge distillation to develop a universal framework that enables federated learning when each agent owns not only their private data, but also uniquely designed models.
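A minimal sketch of the FedMD-style consensus step (the models and the public set here are placeholders): clients with deliberately different architectures score a shared public dataset, and each then distills toward the averaged scores, so class scores rather than weights are communicated.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    public_x = torch.randn(128, 32)               # stand-in shared public dataset
    clients = [nn.Linear(32, 4),                  # architectures differ per client
               nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 4))]

    with torch.no_grad():                         # server averages the class scores
        consensus = torch.stack([c(public_x) for c in clients]).mean(0)

    for model in clients:                         # each client distills locally
        opt = torch.optim.SGD(model.parameters(), lr=0.1)
        loss = F.mse_loss(model(public_x), consensus)
        opt.zero_grad(); loss.backward(); opt.step()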
Federated Meta-Learning with Fast Convergence and Efficient Communication
TLDR: This work proposes FedMeta, a federated meta-learning framework in which a parameterized algorithm (or meta-learner) is shared instead of a global model as in previous approaches; it reduces the required communication cost and increases accuracy compared with Federated Averaging.
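For contrast with the shared meta-learner, the Federated Averaging baseline mentioned above can be sketched as a sample-size-weighted parameter average (illustrative, framework-agnostic Python; fedavg is a hypothetical helper):

    import numpy as np

    def fedavg(client_weights, client_sizes):
        """Weight each client's parameter vector by its local sample count."""
        total = sum(client_sizes)
        return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

    # Three clients with 100, 50, and 25 local examples.
    global_model = fedavg([np.random.randn(10) for _ in range(3)], [100, 50, 25])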
FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning
TLDR: A novel aggregation algorithm named FedBE is proposed, which takes a Bayesian inference perspective: it samples higher-quality global models and combines them via Bayesian model ensemble, leading to much more robust aggregation.
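The Bayesian-ensemble idea can be sketched as follows. This is a simplification: the paper also considers Dirichlet posteriors and distills the ensemble back into a single model on unlabeled server-side data, both omitted here, and the linear predict model is a placeholder.

    import numpy as np

    client_params = [np.random.randn(10) for _ in range(5)]     # stand-in client models
    mu = np.mean(client_params, axis=0)
    sigma = np.std(client_params, axis=0)
    sampled = [np.random.normal(mu, sigma) for _ in range(20)]  # extra global models

    def predict(params, x):          # hypothetical linear model
        return x @ params

    x = np.random.randn(8, 10)
    ensemble_pred = np.mean([predict(p, x) for p in sampled], axis=0)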
Fed-Focal Loss for imbalanced data classification in Federated Learning
TLDR: This paper addresses class imbalance by reshaping the cross-entropy loss so that it down-weights the loss assigned to well-classified examples, along the lines of focal loss, and uses a tunable sampling framework to improve robustness.
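The reshaped loss follows Lin et al.'s focal loss; a minimal PyTorch version is below (the federated training loop and the tunable sampling framework are omitted). The factor (1 - p_t)^gamma shrinks the loss on well-classified examples, and the loss reduces to cross-entropy when gamma = 0.

    import torch
    import torch.nn.functional as F

    def focal_loss(logits, targets, gamma=2.0):
        """Cross-entropy down-weighted by (1 - p_t)^gamma per example."""
        log_pt = F.log_softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
        pt = log_pt.exp()
        return (-(1 - pt) ** gamma * log_pt).mean()

    logits = torch.randn(4, 3, requires_grad=True)
    targets = torch.tensor([0, 2, 1, 2])
    loss = focal_loss(logits, targets)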
On the Convergence of Federated Optimization in Heterogeneous Networks
TLDR: This work proposes FedProx, which is similar in spirit to FedAvg but more amenable to theoretical analysis, and characterizes the convergence of FedProx under a novel device-similarity assumption.
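FedProx's key modification is a proximal term added to each client's local objective, penalizing drift from the current global model. A minimal PyTorch sketch (the model, data, and mu value are illustrative):

    import torch
    import torch.nn as nn

    def proximal_term(model, global_params, mu=0.01):
        """(mu / 2) * ||w - w_global||^2, summed over all parameter tensors."""
        return (mu / 2) * sum((p - g).pow(2).sum()
                              for p, g in zip(model.parameters(), global_params))

    model = nn.Linear(8, 2)                                  # stand-in local model
    global_params = [p.detach().clone() for p in model.parameters()]
    x, y = torch.randn(4, 8), torch.randint(0, 2, (4,))
    loss = nn.functional.cross_entropy(model(x), y) + proximal_term(model, global_params)
    loss.backward()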
Agnostic Federated Learning
TLDR: This work proposes a new framework of agnostic federated learning, where the centralized model is optimized for any target distribution formed by a mixture of the client distributions, and shows that this framework naturally yields a notion of fairness.
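The agnostic objective can be written as a minimax problem over mixtures of client distributions (a LaTeX sketch; in the paper the mixture weights range over a subset Lambda of the simplex):

    \min_{w} \; \max_{\lambda \in \Lambda \subseteq \Delta_K} \; \sum_{k=1}^{K} \lambda_k \, \mathcal{L}_k(w)

where \mathcal{L}_k(w) is client k's expected loss and \Delta_K is the probability simplex over the K clients.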