Corpus ID: 216553597

SplitFed: When Federated Learning Meets Split Learning

@article{Thapa2020SplitFedWF,
  title={SplitFed: When Federated Learning Meets Split Learning},
  author={Chandra Thapa and Pathum Chamikara Mahawaga Arachchige and Seyit Ahmet Çamtepe},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.12088}
}
Federated learning (FL) and split learning (SL) are two recent distributed machine learning (ML) approaches that have gained attention due to their inherent privacy-preserving capabilities. Both approaches follow a model-to-data scenario, in that an ML model is sent to clients for network training and testing. However, FL and SL show contrasting strengths and weaknesses. For example, while FL performs faster than SL due to its parallel client-side model generation strategy, SL provides better…
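
To make the combination concrete, here is a minimal numpy sketch of a SplitFed-style round, under assumptions of my own for illustration: a toy two-layer linear model with squared loss is cut into a client-side part (W1) and a server-side part (W2); clients train against the server each round (closest to the SplitFedv2 flavor, in which a single server-side copy is updated across clients), and a fed server then averages the client-side parts, FedAvg-style. All names, dimensions, and the learning rate are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: three clients share a linear ground truth.
d_in, d_hid, d_out, n_local = 8, 4, 1, 32
W_true = rng.normal(size=(d_out, d_in))
clients = []
for _ in range(3):
    X = rng.normal(size=(n_local, d_in))
    clients.append((X, X @ W_true.T))

# The model is cut into a client-side layer W1 and a server-side layer W2.
W1_global = 0.1 * rng.normal(size=(d_hid, d_in))  # averaged by the fed server
W2 = 0.1 * rng.normal(size=(d_out, d_hid))        # held by the main server
lr = 0.05

for _ in range(100):
    local_W1s, sizes = [], []
    for X, y in clients:                  # conceptually parallel in SplitFed
        W1 = W1_global.copy()             # each round starts from the averaged model
        smashed = X @ W1.T                # "smashed data" sent to the server
        pred = smashed @ W2.T             # server-side forward pass
        g_pred = 2.0 * (pred - y) / len(X)
        g_smashed = g_pred @ W2           # gradient returned to the client at the cut
        W2 -= lr * (g_pred.T @ smashed)   # server updates its part
        W1 -= lr * (g_smashed.T @ X)      # client updates its part
        local_W1s.append(W1)
        sizes.append(len(X))
    # Fed server: data-size-weighted average of the client-side parts.
    W1_global = sum((n / sum(sizes)) * W for W, n in zip(local_W1s, sizes))

# Sanity check: training loss on the first client's data.
X0, y0 = clients[0]
print(np.mean(((X0 @ W1_global.T) @ W2.T - y0) ** 2))

The key property this preserves is that raw X never leaves a client; only cut-layer activations, their gradients, and client-side weights are exchanged.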

Citations

Advancements of federated learning towards privacy preservation: from federated learning to split learning
TLDR: This chapter is designed to provide extensive coverage of SL and its variants, including fundamentals, existing findings, integration with privacy measures such as differential privacy, open problems, and code implementation.
End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things
TLDR: This work is the first attempt to provide empirical comparisons of FL and SplitNN in real-world IoT settings in terms of learning performance and device implementation overhead, and demonstrates that neither FL nor SplitNN can be applied to a heavy model, e.g., one with several million parameters, on resource-constrained IoT devices because the training cost would be too expensive for such devices.
Vulnerability Due to Training Order in Split Learning
TLDR: It is demonstrated that the model trained using the data of all clients does not perform well on the data of the client considered earliest in a training round, and that the SplitFedv3 algorithm mitigates this problem while still leveraging the privacy benefits provided by split learning.
Decentralised Learning in Federated Deployment Environments
TLDR: This survey provides a detailed and up-to-date overview of the most recent contributions available in the state-of-the-art decentralised learning literature, including solutions for privacy, communication efficiency, non-IIDness, device heterogeneity, and poisoning defense.
Comparison of Privacy-Preserving Distributed Deep Learning Methods in Healthcare
TLDR: This paper uses federated learning, split learning, and SplitFed to develop binary classification models for detecting tuberculosis from chest X-rays, and compares them in terms of classification performance, communication and computational costs, and training time.
PyVertical: A Vertical Federated Learning Framework for Multi-headed SplitNN
TLDR: PyVertical, a framework supporting vertical federated learning using split neural networks, is introduced, and the training of a simple dual-headed split neural network for an MNIST classification task is presented.
Federated Learning-based Active Authentication on Mobile Devices
TLDR: This work proposes a novel method that is able to tackle heterogeneous/non-IID distributions of data in federated active authentication (FAA), and shows that this approach performs better than state-of-the-art one-class based FAA methods and is also able to outperform traditional FL/SL methods.
NoPeek: Information leakage reduction to share activations in distributed deep learning
TLDR: This work demonstrates how minimizing distance correlation between raw data and intermediary representations reduces leakage of sensitive raw-data patterns across client communications while maintaining model accuracy in distributed deep learning services.
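
As a reference point for that idea, below is a minimal numpy sketch of the sample distance-correlation statistic that a NoPeek-style objective penalizes between raw inputs and the intermediary activations shared at the cut; the weight alpha and the way it is folded into the task loss are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def _double_center(D):
    # Double-center a pairwise distance matrix.
    return D - D.mean(axis=0, keepdims=True) - D.mean(axis=1, keepdims=True) + D.mean()

def distance_correlation(X, Z):
    # Sample distance correlation between rows of X (raw data) and Z (activations).
    DX = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    DZ = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    A, B = _double_center(DX), _double_center(DZ)
    dcov2 = (A * B).mean()
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(max(dcov2, 0.0) / denom) if denom > 0 else 0.0

# Illustrative combined objective for a batch (alpha is a hypothetical weight):
# total_loss = task_loss + alpha * distance_correlation(X_batch, Z_batch)
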
AdaSplit: Adaptive Trade-offs for Resource-constrained Distributed Deep Learning (2021)
Distributed deep learning frameworks like federated learning (FL) and its variants are enabling personalized experiences across a wide range of web clients and mobile/IoT devices. However, these…
Unleashing the Tiger: Inference Attacks on Split Learning
TLDR: This paper exposes vulnerabilities of the split learning protocol and demonstrates its inherent insecurity by introducing general attack strategies targeting the reconstruction of clients' private training sets and by extending previously devised attacks for federated learning.

References

Showing 1-10 of 46 references.
End-to-End Evaluation of Federated Learning and Split Learning for Internet of Things
TLDR: This work is the first attempt to provide empirical comparisons of FL and SplitNN in real-world IoT settings in terms of learning performance and device implementation overhead, and demonstrates that neither FL nor SplitNN can be applied to a heavy model, e.g., one with several million parameters, on resource-constrained IoT devices because the training cost would be too expensive for such devices.
Detailed comparison of communication efficiency of split learning and federated learning
TLDR: This work considers various practical scenarios of a distributed learning setup, juxtaposes the two methods under various real-life scenarios, and shows useful settings under which each method outperforms the other in terms of communication efficiency.
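
The flavor of that comparison can be reproduced with first-order accounting: per round, each FL client exchanges the full model twice (download and upload), while each SL client exchanges cut-layer activations and gradients once per sample plus its client-side weights. The sketch below uses that accounting; the paper's exact bookkeeping may differ, and all parameter values are hypothetical.

def fl_comm_per_round(num_clients, model_params):
    # Each FL client downloads and uploads the full model once per round.
    return num_clients * 2 * model_params

def sl_comm_per_round(num_clients, samples_per_client, cut_size, client_side_params):
    # Each sample sends activations up and gradients down at the cut layer;
    # client-side weights are relayed once per client turn.
    return num_clients * (2 * samples_per_client * cut_size + 2 * client_side_params)

# Example: a large model with a narrow cut favors SL; lots of local data favors FL.
print(fl_comm_per_round(num_clients=100, model_params=25_000_000))   # 5.0e9 floats
print(sl_comm_per_round(100, samples_per_client=500, cut_size=4_096,
                        client_side_params=100_000))                 # ~4.3e8 floats
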
Federated Optimization: Distributed Optimization Beyond the Datacenter
We introduce a new and increasingly relevant setting for distributed optimization in machine learning, where the data defining the optimization are distributed (unevenly) over an extremely large…
Split learning for health: Distributed deep learning without sharing raw patient data
TLDR: This paper compares the performance and resource-efficiency trade-offs of splitNN against other distributed deep learning methods, such as federated learning and large-batch synchronous stochastic gradient descent, and shows highly encouraging results for splitNN.
Can We Use Split Learning on 1D CNN Models for Privacy Preserving Training?
TLDR: It is still unclear whether split learning can be applied to other deep learning models, in particular 1D CNNs, but it is believed to be a promising approach to protect the client's raw data.
Communication-Efficient Learning of Deep Networks from Decentralized Data
TLDR: This work presents a practical method for the federated learning of deep networks based on iterative model averaging, and conducts an extensive empirical evaluation considering five different model architectures and four datasets.
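
The iterative model averaging at the core of that method (FedAvg) reduces to a dataset-size-weighted average of locally trained parameters. A minimal numpy sketch, with all names and shapes illustrative:

import numpy as np

def fedavg(client_params, client_sizes):
    # Average per-client parameter lists, weighted by local dataset size.
    total = sum(client_sizes)
    return [
        sum((n / total) * p[i] for p, n in zip(client_params, client_sizes))
        for i in range(len(client_params[0]))
    ]

# Example: two clients, one weight matrix each; the larger dataset dominates.
w_a = [np.ones((2, 2))]
w_b = [np.zeros((2, 2))]
print(fedavg([w_a, w_b], client_sizes=[30, 10]))  # 0.75 everywhere
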
Towards Federated Learning at Scale: System Design
TLDR: A scalable production system for federated learning on mobile devices, based on TensorFlow, is built; the resulting high-level design is described, and some of the challenges and their solutions are sketched.
Distributed learning of deep neural network over multiple agents
TLDR: This work proposes a new technique to train deep neural networks over several data sources in a distributed fashion, which paves the way for distributed training of deep neural networks in data-sensitive applications where raw data may not be shared directly.
DIANNE: a modular framework for designing, training and deploying deep neural networks on heterogeneous distributed infrastructure
TLDR: The DIANNE framework is proposed as an all-in-one solution for deep learning, enabling data and model parallelism through a modular design, offloading to local compute power, and the ability to abstract between simulated and real environments.
Local Differential Privacy for Deep Learning
TLDR: A new local differentially private (LDP) algorithm named LATENT is proposed that redesigns the training process and enables a data owner to add a randomization layer before data leave the data owner's device and reach a potentially untrusted machine learning service.
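
For intuition, the randomization layer in such LDP schemes can be viewed as randomized response applied to a binarized representation of the client-side output; the sketch below shows that generic mechanism and is an assumption-laden stand-in, not LATENT's exact encoding.

import numpy as np

def randomized_response(bits, epsilon, rng=None):
    # Keep each bit with probability e^eps / (1 + e^eps), flip it otherwise;
    # this satisfies eps-local differential privacy per bit.
    rng = rng or np.random.default_rng()
    p_keep = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    keep = rng.random(bits.shape) < p_keep
    return np.where(keep, bits, 1 - bits)

# Example: binarize a hypothetical activation vector, then privatize it locally
# before it leaves the data owner's device.
activations = np.array([0.7, -1.2, 0.1, 2.3])
bits = (activations > 0).astype(int)
print(randomized_response(bits, epsilon=1.0))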