Corpus ID: 239616293

WebFed: Cross-platform Federated Learning Framework Based on Web Browser with Local Differential Privacy

@article{Lian2021WebFedCF,
  title={WebFed: Cross-platform Federated Learning Framework Based on Web Browser with Local Differential Privacy},
  author={Zhuotao Lian and Qinglin Yang and Qingkui Zeng and Chunhua Su},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.11646}
}
To address isolated data islands and privacy concerns, federated learning has attracted extensive interest, since it allows clients to collaboratively train a global model on their local data without sharing any of it with a third party. However, existing federated learning frameworks usually require sophisticated environment configuration (e.g., driver setup for a standalone graphics card such as an NVIDIA GPU, a compilation environment), which brings much inconvenience for large-scale…
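The core mechanism that the title and abstract describe, training in the browser and perturbing the locally trained weights with local differential privacy before they leave the client, can be illustrated with a short sketch. The Laplace mechanism, the function names, and the epsilon/sensitivity parameters below are illustrative assumptions, not WebFed's actual implementation.

```typescript
// Minimal sketch (not the actual WebFed code): add per-coordinate Laplace
// noise to a client's locally trained weights before uploading them.

// Draw one sample from Laplace(0, scale) via inverse-CDF sampling.
function sampleLaplace(scale: number): number {
  const u = Math.random() - 0.5;                // uniform in (-0.5, 0.5)
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Perturb the flattened weight vector; sensitivity / epsilon is the usual
// Laplace scale, and both values here are illustrative parameters.
function perturbWeights(weights: Float32Array,
                        epsilon: number,
                        sensitivity: number): Float32Array {
  const scale = sensitivity / epsilon;
  const noisy = new Float32Array(weights.length);
  for (let i = 0; i < weights.length; i++) {
    noisy[i] = weights[i] + sampleLaplace(scale);
  }
  return noisy;
}
```

In a browser deployment, `weights` would typically come from a model trained with a JavaScript framework such as TensorFlow.js, and only the noisy copy would be sent to the aggregation server.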


References

Showing 1-10 of 22 references
Asynchronous Federated Learning with Differential Privacy for Edge Intelligence
TLDR
A multi-stage adjustable private algorithm (MAPA) is proposed to improve the trade-off between model utility and privacy by dynamically adjusting both the noise scale and the learning rate, and it is demonstrated that MAPA significantly improves both model accuracy and convergence speed with a sufficient privacy guarantee.
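As a rough illustration of jointly shrinking the noise scale and the learning rate over training stages (the actual MAPA schedules and their privacy accounting are more involved), one might write:

```typescript
// Hypothetical multi-stage schedule: noise scale and learning rate both decay
// geometrically with the stage index. The decay factors are made-up
// illustration values, not ones derived in the MAPA paper.
function stageSchedule(stage: number,
                       initialNoise = 1.0,
                       initialLr = 0.1,
                       noiseDecay = 0.8,
                       lrDecay = 0.9): { noiseScale: number; lr: number } {
  return {
    noiseScale: initialNoise * Math.pow(noiseDecay, stage),
    lr: initialLr * Math.pow(lrDecay, stage),
  };
}
```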
Federated Learning With Differential Privacy: Algorithms and Performance Analysis
  • Kang Wei, Jun Li, +6 authors H. Poor
  • IEEE Transactions on Information Forensics and Security
  • 2020
TLDR
A novel framework based on differential privacy is proposed, in which artificial noise is added to parameters on the clients' side before aggregation, namely noising before model aggregation FL (NbAFL), and an optimal convergence bound is derived that achieves the best convergence performance at a fixed privacy level.
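The "noising before aggregation" idea can be sketched as follows; the clip-then-Gaussian-noise step and the plain averaging are generic illustrations of the approach, not the NbAFL algorithm or its calibrated noise variance.

```typescript
// Sketch only: each client clips its update and adds Gaussian noise, and the
// server averages the already-noised updates.

// Standard normal sample via the Box-Muller transform, scaled by `std`.
function sampleGaussian(std: number): number {
  const u1 = Math.random() || 1e-12;   // avoid log(0)
  const u2 = Math.random();
  return std * Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
}

// Client side: clip the update to an L2 norm bound, then add noise.
function noisyClientUpdate(update: number[], clipNorm: number, sigma: number): number[] {
  const norm = Math.sqrt(update.reduce((s, v) => s + v * v, 0));
  const factor = Math.min(1, clipNorm / (norm || 1e-12));
  return update.map(v => v * factor + sampleGaussian(sigma));
}

// Server side: unweighted federated averaging of the noisy client updates.
function aggregate(clientUpdates: number[][]): number[] {
  const n = clientUpdates.length;
  return clientUpdates[0].map((_, i) =>
    clientUpdates.reduce((s, u) => s + u[i], 0) / n);
}
```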
Comprehensive Privacy Analysis of Deep Learning: Passive and Active White-box Inference Attacks against Centralized and Federated Learning
TLDR
The reasons why deep learning models may leak information about their training data are investigated, and new algorithms tailored to the white-box setting are designed by exploiting the privacy vulnerabilities of stochastic gradient descent, the algorithm used to train deep neural networks.
Moving Deep Learning into Web Browser: How Far Can We Go?
TLDR
This paper conducts the first empirical study of deep learning in browsers, surveying the 7 most popular JavaScript-based deep learning frameworks, investigating to what extent deep learning tasks have been supported in browsers so far, and measuring the performance of different frameworks when running different deep learning tasks.
Federated Learning: Challenges, Methods, and Future Directions
TLDR
The unique characteristics and challenges of federated learning are discussed, a broad overview of current approaches is provided, and several directions of future work that are relevant to a wide range of research communities are outlined.
Open source column: Deep learning in the browser
Having already discussed MatConvNet and Keras, let us continue with an open source framework for deep learning, which takes a new and interesting approach. TensorFlow.js is not only providing deep
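For readers who have not used it, the kind of in-browser training that TensorFlow.js enables looks roughly like the following toy example (random stand-in data and an arbitrary tiny model, not code from the column itself):

```typescript
import * as tf from '@tensorflow/tfjs';

// Toy example: define and train a tiny classifier entirely in the browser.
async function trainInBrowser(): Promise<tf.LayersModel> {
  const model = tf.sequential();
  model.add(tf.layers.dense({ inputShape: [4], units: 8, activation: 'relu' }));
  model.add(tf.layers.dense({ units: 3, activation: 'softmax' }));
  model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy', metrics: ['accuracy'] });

  // Random stand-in data; a real application would load user or sensor data.
  const xs = tf.randomNormal([32, 4]);
  const ys = tf.oneHot(tf.randomUniform([32], 0, 3, 'int32'), 3).toFloat();

  await model.fit(xs, ys, { epochs: 5, batchSize: 8 });
  return model;
}
```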
FFD: A Federated Learning Based Method for Credit Card Fraud Detection
TLDR
This paper proposes a framework for training a fraud detection model on behavior features with federated learning and evaluates the performance of the proposed framework, FFD (Federated learning for Fraud Detection), on a large-scale dataset of real-world credit card transactions.
Collecting and Analyzing Multidimensional Data with Local Differential Privacy
  • N. Wang, Xiaokui Xiao, +5 authors Ge Yu
  • 2019 IEEE 35th International Conference on Data Engineering (ICDE)
  • 2019
TLDR
Novel LDP mechanisms for collecting a numeric attribute, whose accuracy is at least no worse (and usually better) than existing solutions in terms of worst-case noise variance, are proposed and extended to multidimensional data that can contain both numeric and categorical attributes, where they always outperform existing solutions regarding worst-case noise variance.
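For context on what such mechanisms look like, here is a sketch of the simple binary randomizer commonly cited as a baseline for one numeric value in [-1, 1]; it is written from memory as a baseline illustration and is not one of the improved mechanisms proposed in this paper.

```typescript
// Baseline epsilon-LDP randomizer for a single value t in [-1, 1]:
// report +C or -C, with C = (e^eps + 1) / (e^eps - 1), chosen so that the
// expectation of the report equals t (unbiased).
function ldpNumeric(t: number, epsilon: number): number {
  const e = Math.exp(epsilon);
  const C = (e + 1) / (e - 1);
  // Probability of reporting +C grows linearly with t.
  const pPlus = 0.5 + (t * (e - 1)) / (2 * (e + 1));
  return Math.random() < pPlus ? C : -C;
}
```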
Progressive Web Apps: The Possible Web-native Unifier for Mobile Development
TLDR
It is argued that progressive web apps are a possible unifying technology for web apps and native apps; two cross-platform mobile apps and one Progressive Web App are developed for comparison purposes and provided in an open-source repository for verification of the results' validity.
MLitB: machine learning in the browser
With few exceptions, the field of Machine Learning (ML) research has largely ignored the browser as a computational engine. Beyond an educational resource for ML, the browser has vast potential to