# Prospects of federated machine learning in fluid dynamics

@article{San2022ProspectsOF,
title={Prospects of federated machine learning in fluid dynamics},
author={Omer San and Suraj Pawar and Adil Rasheed},
journal={ArXiv},
year={2022},
volume={abs/2208.07017}
}
• Published 15 August 2022
## Abstract

Physics-based models have been mainstream in fluid dynamics for developing predictive models. In recent years, machine learning has offered a renaissance to the fluid community due to rapid developments in data science, processing units, neural-network-based technologies, and sensor adaptations. So far, in many applications in fluid dynamics, machine learning approaches have mostly followed a standard process that requires centralizing the training data on a designated machine or in a…
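The contrast the abstract draws is between centralized training, where all data is gathered on one machine, and federated training, where models are fitted locally and only parameters are exchanged. A minimal sketch of the canonical federated averaging (FedAvg) scheme is below; the function names, the toy linear model, and all hyperparameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient-descent steps on a linear model (MSE loss).
    Only the updated weights leave the client; the raw (X, y) data never does."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w = w - lr * grad
    return w

def fed_avg(global_w, client_data, rounds=20):
    """Federated averaging: each round, clients train locally from the current
    global weights, and the server averages the results weighted by dataset size."""
    for _ in range(rounds):
        sizes = [len(y) for _, y in client_data]
        local_ws = [local_update(global_w, X, y) for X, y in client_data]
        total = sum(sizes)
        global_w = sum((n / total) * w for n, w in zip(sizes, local_ws))
    return global_w

# Toy demo: two clients whose private data follow the same relation y = 2x.
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 1))
    y = X @ np.array([2.0])
    clients.append((X, y))

w = fed_avg(np.zeros(1), clients)  # converges toward the true coefficient 2.0
```

In a fluid-dynamics setting the "clients" could be, for instance, separate sensor arrays or simulation campaigns whose data cannot be pooled; the same averaging loop applies with the linear model swapped for a neural surrogate.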
