Scaling Neuroscience Research Using Federated Learning

@article{Stripelis2021ScalingNR,
  title={Scaling Neuroscience Research Using Federated Learning},
  author={Dimitris Stripelis and J. Ambite and Pradeep Lam and Paul M. Thompson},
  journal={2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI)},
  year={2021}
}
The amount of biomedical data continues to grow rapidly. However, the ability to analyze these data is limited due to privacy and regulatory concerns. Machine learning approaches that require data to be copied to a single location are hampered by the challenges of data sharing. Federated Learning is a promising approach to learn a joint model over data silos. This architecture does not share any subject data across sites, only aggregated parameters, often in encrypted environments, thus… 
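The aggregation described above — sharing only model parameters across sites, never subject data — can be sketched with a generic federated-averaging step. This is an illustrative sketch of the general technique, not the paper's actual implementation; `federated_average`, `site_params`, and `site_sizes` are names chosen here for clarity:

```python
import numpy as np

def federated_average(site_params, site_sizes):
    """Aggregate per-site parameter vectors into a joint model.

    Each site trains locally and sends only its parameters; the
    aggregator weights each contribution by the site's sample count.
    """
    total = sum(site_sizes)
    stacked = np.stack(site_params)               # shape: (n_sites, n_params)
    weights = np.array(site_sizes, dtype=float) / total
    return weights @ stacked                      # weighted mean of parameters

# Example: three sites with different amounts of local data.
params = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [100, 100, 200]
global_params = federated_average(params, sizes)  # -> array([3.5, 4.5])
```

The site with twice the data contributes twice the weight; only the aggregated vector would leave the (possibly encrypted) aggregation environment.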


Secure Federated Learning for Neuroimaging

A Secure Federated Learning architecture, MetisFL, is presented, which enables distributed training of neural networks over multiple data sources without sharing data, and is demonstrated in neuroimaging.

Secure neuroimaging analysis using federated learning with homomorphic encryption

This work proposes a framework for secure FL using fully homomorphic encryption (FHE), and uses the CKKS construction, an approximate, floating-point-compatible scheme that benefits from ciphertext packing and rescaling.

Towards Sparsified Federated Neuroimaging Models via Weight Pruning

It is demonstrated that models with high sparsity are less susceptible to membership inference attacks, a type of privacy attack, and FedSparsify, which performs model pruning during federated training, is proposed.
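Magnitude-based weight pruning of the kind such methods build on can be sketched as follows. This is a generic illustration under the usual assumptions (prune the smallest-magnitude weights), not the FedSparsify implementation; `magnitude_prune` is a name chosen here:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep strictly larger weights
    return weights * mask

w = np.array([[0.05, -1.2], [0.3, -0.01]])
pruned = magnitude_prune(w, 0.5)  # the two smallest-magnitude weights become 0
```

In a federated setting, pruning is applied during training rounds, so both the shared updates and the final model are sparse; the paper's observation is that this sparsity also reduces what membership inference can recover.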

Membership Inference Attacks on Deep Regression Models for Neuroimaging

It is demonstrated that allowing access to parameters may leak private information even if data is never directly shared, and feasible attacks on brain age prediction models (deep learning models that predict a person’s age from their brain MRI scan) are demonstrated.

Semi-Synchronous Federated Learning for Energy-Efficient Training and Accelerated Convergence in Cross-Silo Settings

A novel energy-efficient Semi-Synchronous Federated Learning protocol that mixes local models periodically with minimal idle time and fast convergence is introduced, significantly outperforming previous work in data- and computationally heterogeneous environments.

Federated Learning Meets Natural Language Processing: A Survey

This survey discusses major challenges in federated natural language processing, including the algorithm challenges, system challenges as well as the privacy issues, and provides a critical review of the existing Federated NLP evaluation methods and tools.

Applications of Federated Learning; Taxonomy, Challenges, and Research Trends

The areas of medical AI, IoT, edge systems, and the autonomous industry can adopt FL in many of their sub-domains; however, the challenges these domains can encounter include statistical heterogeneity, system heterogeneity, data imbalance, resource allocation, and privacy.

Refacing Defaced MRI with PixelCNN

This work simulates the refacing process with convincing results, aiming for a model that can reconstruct the face entirely from information in the brain scan, as a purely supervised facial reconstruction.

Federated Morphometry Feature Selection for Hippocampal Morphometry Associated Beta-Amyloid and Tau Pathology

This work proposes a novel framework, Federated Morphometry Feature Selection (FMFS) model, to examine subtle aspects of hippocampal morphometry that are associated with Aβ/tau burden in the brain, measured using positron emission tomography (PET).

Accelerating model synchronization for distributed machine learning in an optical wide area network

This paper uses an aggregation tree for each Geo-DML training job to reduce model-synchronization communication overhead across the WAN, and proposes two efficient algorithms to accelerate GMS for Geo-DML: MOptree, a model-based algorithm for single-job scheduling, and MMOptree for multiple-job scheduling, which reconfigure the WAN topology and trees by reassigning wavelengths on each fiber.



BrainTorrent: A Peer-to-Peer Environment for Decentralized Federated Learning

The overall effectiveness of FL for the challenging task of whole-brain segmentation is demonstrated, and the proposed server-less BrainTorrent approach is observed not only to outperform the traditional server-based one but to reach performance similar to a model trained on pooled data.

Federated Learning in Distributed Medical Databases: Meta-Analysis of Large-Scale Subcortical Brain Data

A federated learning framework for securely accessing and meta-analyzing any biomedical data without sharing individual information is proposed and applied to multi-centric, multi-database studies including ADNI, PPMI, MIRIAD and UK Biobank, showing the potential of the approach for further applications in distributed analysis of multi-centric cohorts.

Semi-Synchronous Federated Learning

A novel Semi-Synchronous Federated Learning protocol that mixes local models periodically with minimal idle time and fast convergence is introduced, significantly outperforming previous work in data- and computationally heterogeneous environments.

Privacy-preserving Federated Brain Tumour Segmentation

The feasibility of applying differential-privacy techniques to protect patient data in a federated learning setup for brain tumour segmentation on the BraTS dataset is investigated, showing a trade-off between model performance and privacy protection costs.
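The differential-privacy mechanisms explored in this line of work typically bound each participant's update and add calibrated noise before sharing. A minimal sketch of that generic clip-and-noise step (not the paper's exact setup; `privatize_update` and its parameters are names chosen here):

```python
import numpy as np

def privatize_update(update, clip_norm, noise_multiplier, rng):
    """Clip an update to bounded L2 norm, then add Gaussian noise.

    Bounding the norm limits any single subject's influence; the noise
    scale (noise_multiplier * clip_norm) sets the privacy/utility trade-off.
    """
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

rng = np.random.default_rng(0)
noisy = privatize_update(np.array([3.0, 4.0]), clip_norm=1.0,
                         noise_multiplier=0.5, rng=rng)
```

Raising `noise_multiplier` strengthens the privacy guarantee but degrades model accuracy, which is exactly the trade-off the summary above refers to.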

Fed-BioMed: A General Open-Source Frontend Framework for Federated Learning in Healthcare

This work proposes an open-source federated learning frontend framework based on a general architecture accommodating different models and optimization methods, presents software components for clients and the central node, and illustrates the workflow for deploying learning models.

The future of digital health with federated learning

This paper considers key factors contributing to this issue, explores how federated learning (FL) may provide a solution for the future of digital health and highlights the challenges and considerations that need to be addressed.

COINSTAC: A Privacy Enabled Model and Prototype for Leveraging and Processing Decentralized Brain Imaging Data

A dynamic, decentralized platform for large-scale analyses, the Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC), is presented; it enables access to many currently unavailable data sets, offers a user-friendly privacy-enabled interface for decentralized analysis, and provides a powerful solution that complements existing data-sharing solutions.

Federated Machine Learning: Concept and Applications

This work proposes building data networks among organizations based on federated mechanisms as an effective solution to allow knowledge to be shared without compromising user privacy.

Federated Optimization in Heterogeneous Networks

This work introduces a framework, FedProx, to tackle heterogeneity in federated networks, and provides convergence guarantees for this framework when learning over data from non-identical distributions (statistical heterogeneity), and while adhering to device-level systems constraints by allowing each participating device to perform a variable amount of work.
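FedProx's core idea is to add a proximal term (mu/2)·||w − w_global||² to each device's local objective, keeping variable amounts of local work from drifting too far from the global model. A minimal gradient-step sketch on a toy quadratic loss (illustrative only, not the authors' code; `fedprox_local_step` is a name chosen here):

```python
import numpy as np

def fedprox_local_step(w, w_global, grad_fn, mu, lr):
    """One local step on loss f(w) + (mu/2) * ||w - w_global||^2."""
    grad = grad_fn(w) + mu * (w - w_global)  # proximal term pulls w toward w_global
    return w - lr * grad

# Toy local loss f(w) = 0.5 * ||w - target||^2, so grad f(w) = w - target.
target = np.array([2.0, -1.0])
grad_fn = lambda w: w - target
w_global = np.zeros(2)
w = w_global.copy()
for _ in range(100):
    w = fedprox_local_step(w, w_global, grad_fn, mu=1.0, lr=0.1)
# With mu=1 the local optimum lands halfway between target and w_global,
# i.e. w converges to [1.0, -0.5] rather than the purely local [2.0, -1.0].
```

Setting `mu=0` recovers plain local SGD; larger `mu` trades local fit for stability of the global model under statistical heterogeneity.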