Corpus ID: 53824194

Partitioned Variational Inference: A unified framework encompassing federated and continual learning

@article{Bui2018PartitionedVI,
  title={Partitioned Variational Inference: A unified framework encompassing federated and continual learning},
  author={Thang D. Bui and Cuong V Nguyen and Siddharth Swaroop and Richard E. Turner},
  journal={ArXiv},
  year={2018},
  volume={abs/1811.11206}
}
Variational inference (VI) has become the method of choice for fitting many modern probabilistic models. However, practitioners are faced with a fragmented literature that offers a bewildering array of algorithmic options. First, the variational family. Second, the granularity of the updates, e.g. whether the updates are local to each data point (and employ message passing) or global. Third, the method of optimization (bespoke or black-box, closed-form or stochastic updates, etc.). This paper… 
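The abstract stops short of an update rule, but the kind of partitioned update the framework unifies can be sketched for a fully conjugate toy case. The snippet below is an illustrative, hypothetical example rather than the paper's reference implementation: a Gaussian mean with known noise variance, data split across shards, one approximate-likelihood factor per shard held in natural parameters (precision-mean and precision), and local refinements of one factor at a time while the global posterior is the prior times the product of the factors. All names and the choice of model are assumptions made for illustration.

import numpy as np

np.random.seed(0)
s2 = 0.5                                  # known observation noise variance (assumed)
prior = np.array([0.0, 1.0])              # [precision-mean, precision] of a N(0, 1) prior
shards = [np.random.normal(1.5, np.sqrt(s2), size=n) for n in (20, 35, 15)]

# one approximate-likelihood factor per shard, initialised to be uninformative
factors = [np.zeros(2) for _ in shards]

def global_posterior(prior, factors):
    # q(theta) is proportional to prior * prod_m t_m(theta); everything is
    # Gaussian here, so natural parameters simply add up
    return prior + sum(factors)

for sweep in range(2):
    for m, y in enumerate(shards):
        q = global_posterior(prior, factors)
        cavity = q - factors[m]           # remove shard m's current contribution
        # local step: with a conjugate Gaussian likelihood the optimal tilted
        # posterior is available in closed form, so the update is exact here
        lik = np.array([y.sum() / s2, len(y) / s2])
        tilted = cavity + lik
        factors[m] = tilted - cavity      # updated approximate-likelihood factor

q = global_posterior(prior, factors)
mean, var = q[0] / q[1], 1.0 / q[1]
print(f"posterior mean {mean:.3f}, variance {var:.4f}")

In the non-conjugate case the closed-form tilted step would be replaced by a local free-energy optimisation, which is where the algorithmic choices listed in the abstract come into play.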
Differentially Private Federated Variational Inference
TLDR
It is shown that it is possible to learn moderately private logistic regression models in the federated setting that achieve similar performance to models trained non-privately on centralised data.
Federated Functional Variational Inference
TLDR
This work proposes FSVI, a method to train Bayesian neural networks in the federated setting that builds upon recent advances in functional variational inference and posits prior distributions directly in the function space of the network.
Bayesian Variational Federated Learning and Unlearning in Decentralized Networks
  • J. Gong, O. Simeone, Joonhyuk Kang
  • Computer Science
    2021 IEEE 22nd International Workshop on Signal Processing Advances in Wireless Communications (SPAWC)
  • 2021
TLDR
This paper develops federated variational inference (VI) solutions within a Bayesian framework, based on the decentralized solution of local free-energy minimization problems in exponential-family models and on local, gossip-driven communication over the decentralized network.
Distributed Variational Inference and Privacy
TLDR
This dissertation extends Partitioned Variational Inference to support private federated learning using the concept of differential privacy and applies the data-point level DPPVI algorithm to Bayesian multi-dimensional linear regression models.
Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
TLDR
Distributed Stein Variational Gradient Descent is shown to compare favorably to benchmark frequentist and Bayesian federated learning strategies in terms of accuracy and scalability with respect to the number of agents, while also providing well-calibrated, and hence trustworthy, predictions.
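The Stein variational gradient descent update at the core of this line of work is standard and easy to show; the sketch below is a single-machine toy version (it does not reproduce the distributed coordination studied in the paper) that moves a set of particles towards a one-dimensional Gaussian target using an RBF kernel with a fixed, hand-picked bandwidth.

import numpy as np

np.random.seed(1)
mu, sigma2 = 2.0, 0.5                       # toy target: N(2.0, 0.5)
grad_log_p = lambda x: -(x - mu) / sigma2   # score function of the target

particles = np.random.normal(0.0, 1.0, size=50)
h = 0.5                                     # RBF kernel bandwidth (assumed fixed)
step = 0.05

for _ in range(500):
    diff = particles[:, None] - particles[None, :]    # x_j - x_i
    k = np.exp(-diff**2 / (2 * h**2))                  # k(x_j, x_i)
    grad_k = -diff / h**2 * k                          # d k(x_j, x_i) / d x_j
    # SVGD direction: kernel-weighted scores (attraction) plus kernel
    # gradients (repulsion, which keeps the particles spread out)
    phi = (k * grad_log_p(particles)[:, None] + grad_k).mean(axis=0)
    particles = particles + step * phi

print(particles.mean(), particles.var())    # should land near the target's mean and variance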
Forget-SVGD: Particle-Based Bayesian Federated Unlearning
TLDR
This paper proposes to leverage the flexibility of non-parametric Bayesian approximate inference to develop a novel Bayesian federated unlearning method, referred to as Forget-Stein Variational Gradient Descent (Forget-SVGD).
FedPop: A Bayesian Approach for Personalised Federated Learning
TLDR
This paper proposes a novel methodology coined FedPop by recasting personalised FL into the population modeling paradigm, where clients’ models involve fixed common population parameters and random effects aimed at explaining data heterogeneity.
Variational Federated Multi-Task Learning
TLDR
In VIRTUAL, the federated network of the server and the clients is treated as a star-shaped Bayesian network and learning is performed on it using approximate variational inference; the method is shown to be effective on real-world federated datasets.
Modular Gaussian Processes for Transfer Learning
TLDR
This work develops a module-based method in which, given a dictionary of well-fitted GPs, one can build ensemble GP models without revisiting any data; it exploits the augmentation of high-dimensional integral operators based on the Kullback-Leibler divergence between stochastic processes.
QLSD: Quantised Langevin stochastic dynamics for Bayesian federated learning
TLDR
This paper proposes a novel federated Markov chain Monte Carlo algorithm, referred to as Quantised Langevin Stochastic Dynamics, which may be seen as an extension of Stochastic Gradient Langevin Dynamics to the FL setting and which handles the communication bottleneck using gradient compression.
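As a rough illustration of the two ingredients named above, a Langevin-style update combined with gradient compression, the sketch below pairs a vanilla stochastic-gradient Langevin step on the server with unbiased stochastic rounding of per-client gradients before they are aggregated. The quantiser, the toy Gaussian model, and the step size are placeholders, not the paper's exact algorithm.

import numpy as np

np.random.seed(2)

def stochastic_round(g, levels=16):
    # unbiased stochastic quantisation of a gradient vector onto a uniform grid
    scale = np.max(np.abs(g)) + 1e-12
    u = g / scale * levels
    low = np.floor(u)
    q = low + (np.random.rand(*g.shape) < (u - low))
    return q / levels * scale

# toy federated setting: each client holds observations y ~ N(theta, 1)
clients = [np.random.normal(1.0, 1.0, size=100) for _ in range(5)]

theta = np.zeros(1)
eps = 1e-3                                     # Langevin step size (assumed)
for t in range(2000):
    grads = []
    for y in clients:
        g = -(y - theta).sum(keepdims=True)    # client gradient of the negative log-likelihood
        grads.append(stochastic_round(g))      # compress before "sending" to the server
    grad_U = sum(grads) + theta                # add the gradient of a N(0, 1) prior energy
    theta = theta - 0.5 * eps * grad_U + np.sqrt(eps) * np.random.randn(1)

print(theta[0])                                # samples concentrate near the posterior mean (about 1.0)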
...

References

SHOWING 1-10 OF 106 REFERENCES
Monte Carlo Structured SVI for Non-Conjugate Models
TLDR
The resulting approach, Monte Carlo Structured SVI (MC-SSVI), significantly extends the scope of SVI, enabling large-scale learning in non-conjugate models, and improves over previous work that used the much stronger mean-field variational approximation.
Variational Message Passing
TLDR
Variational Message Passing, a general-purpose algorithm for applying variational inference to Bayesian networks, is introduced; it can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation.
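For a concrete sense of the conjugate-exponential setting VMP targets, the sketch below runs factorised (mean-field) coordinate updates for a Gaussian with unknown mean and precision, under independent Normal and Gamma priors. It is a minimal hand-coded instance of the update pattern, not the general message-passing machinery; the priors and data are invented for illustration.

import numpy as np

np.random.seed(5)
y = np.random.normal(2.0, 0.8, size=200)       # toy data
N = len(y)

# priors: mu ~ N(0, 10^2), tau ~ Gamma(1, 1) (shape/rate)
m0, s0_2 = 0.0, 100.0
a0, b0 = 1.0, 1.0

a, b = a0, b0                                  # initial q(tau) = Gamma(a, b)
for _ in range(50):
    E_tau = a / b
    # update q(mu) = N(m, s2) holding q(tau) fixed
    s2 = 1.0 / (1.0 / s0_2 + N * E_tau)
    m = s2 * (m0 / s0_2 + E_tau * y.sum())
    # update q(tau) = Gamma(a, b) holding q(mu) fixed
    a = a0 + N / 2.0
    b = b0 + 0.5 * (((y - m) ** 2).sum() + N * s2)

print(m, np.sqrt(s2), a / b)                   # approx. posterior mean of mu, its std, E[tau]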
On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes
TLDR
A substantial generalization of the variational framework for learning inducing variables is given, together with a new proof of the result for infinite index sets that allows inducing points that are not data points and likelihoods that depend on all function values.
Stochastic Expectation Propagation
TLDR
Stochastic expectation propagation (SEP) is presented, which maintains a global posterior approximation but updates it in a local way (like EP) and is ideally suited to performing approximate Bayesian learning in the large-model, large-dataset setting.
Variational Inference with Normalizing Flows
TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in the performance and applicability of variational inference.
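The basic device, pushing samples from a simple base distribution through an invertible map while tracking the log-determinant of its Jacobian, takes only a few lines to show. The sketch below implements a single planar flow in numpy with arbitrarily chosen parameters; it illustrates the change-of-variables bookkeeping only, not the amortised inference setup discussed in the paper.

import numpy as np

np.random.seed(3)
d = 2
# planar flow f(z) = z + u * tanh(w.z + b); parameters chosen arbitrarily
w = np.array([1.0, 0.5])
u = np.array([0.8, -0.3])                        # w.u > -1 keeps the map invertible
b = 0.2

def flow_and_logdet(z):
    a = np.tanh(z @ w + b)                       # shape (n,)
    z_new = z + a[:, None] * u                   # f(z)
    psi = (1 - a ** 2)[:, None] * w              # derivative of tanh(w.z + b) w.r.t. z
    logdet = np.log(np.abs(1 + psi @ u))         # log |det(I + u psi^T)|
    return z_new, logdet

# push base samples N(0, I) through the flow and track their log-density
z0 = np.random.randn(1000, d)
log_q0 = -0.5 * (z0 ** 2).sum(axis=1) - 0.5 * d * np.log(2 * np.pi)
z1, logdet = flow_and_logdet(z0)
log_q1 = log_q0 - logdet                         # change-of-variables formula

print(z1.mean(axis=0), log_q1.mean())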
Distributed Bayesian Learning with Stochastic Natural Gradient Expectation Propagation and the Posterior Server
TLDR
Stochastic natural-gradient expectation propagation is proposed as a novel alternative to expectation propagation, a popular variational inference algorithm, together with a novel architecture for distributed Bayesian learning called the posterior server.
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
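Its central device, the reparameterisation trick, is easy to show in isolation: write z = mu + sigma * eps with eps drawn from a standard normal, so that the gradient of an expectation under q(z) passes through the sampling step. The sketch below uses that pathwise estimator to minimise a toy expected loss by gradient descent on mu and log sigma; the loss and all constants are assumptions for illustration, and no encoder/decoder network is involved.

import numpy as np

np.random.seed(4)

f = lambda z: (z - 3.0) ** 2                 # toy loss whose expectation we minimise
df_dz = lambda z: 2.0 * (z - 3.0)

mu, log_sigma = 0.0, 0.0
lr = 0.05
for step in range(500):
    eps = np.random.randn(256)
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                     # reparameterised samples, z ~ N(mu, sigma^2)
    # pathwise (reparameterisation) gradients of E_q[f(z)]
    g_mu = df_dz(z).mean()
    g_log_sigma = (df_dz(z) * eps * sigma).mean()
    mu -= lr * g_mu
    log_sigma -= lr * g_log_sigma

print(mu, np.exp(log_sigma))                 # mu moves towards 3; sigma shrinks (no entropy term here)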
Monte Carlo Structured SVI for Two-Level Non-Conjugate Models
TLDR
The resulting approach, Monte Carlo Structured SVI (MC-SSVI), significantly extends the scope of SVI, enabling large-scale learning in non-conjugate models; a hybrid algorithm using both standard and natural gradients is also proposed and shown to improve stability and convergence.
Streaming Sparse Gaussian Process Approximations
TLDR
A new principled framework for deploying Gaussian process probabilistic models in the streaming setting is developed, providing methods for learning hyperparameters and optimising pseudo-input locations.
Hierarchical Variational Models
TLDR
This work develops hierarchical variational models (HVMs), which augment a variational approximation with a prior on its parameters, allowing them to capture complex structure for both discrete and continuous latent variables.
...