• Corpus ID: 244714222

Dynamic Network-Assisted D2D-Aided Coded Distributed Learning

@inproceedings{Zeulin2021DynamicND,
  title={Dynamic Network-Assisted D2D-Aided Coded Distributed Learning},
  author={Nikita Zeulin and Olga Galinina and Nageen Himayat and Sergey D. Andreev and Robert W. Heath},
  year={2021}
}
Today, various machine learning (ML) applications offer continuous data processing and real-time data analytics at the edge of a wireless network. Distributed real-time ML solutions are highly sensitive to the so-called straggler effect caused by resource heterogeneity; it is alleviated by various computation offloading mechanisms, which in turn seriously challenge communication efficiency, especially in large-scale scenarios. To decrease the communication overhead, we rely on device-to-device (D2D… 
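The straggler mitigation underlying coded distributed learning can be illustrated with a minimal sketch. Assume a gradient-coding-style scheme for a linear model: k partial gradients are combined into n coded worker responses via a random linear code, so the master can recover the full gradient from any k of the n workers, tolerating n − k stragglers. All names and parameters here (k, n, the encoding matrix G) are illustrative, not the paper's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

k, n = 3, 5          # k source partitions, n workers: tolerates n - k stragglers
d = 4                # model dimension
X = rng.normal(size=(300, d))
y = X @ rng.normal(size=d)
w = np.zeros(d)      # current model iterate

# Per-partition gradients of 0.5 * ||Xw - y||^2 (what each worker would compute)
parts = np.array_split(np.arange(300), k)
g = np.stack([X[p].T @ (X[p] @ w - y[p]) for p in parts])   # shape (k, d)

# Random linear encoding: each worker returns one random combination of the
# k partial gradients (a random k x k submatrix is invertible w.h.p.)
G = rng.normal(size=(n, k))
coded = G @ g

# Suppose only workers {0, 2, 4} respond in time; drop the two stragglers.
fast = [0, 2, 4]
decoded = np.linalg.solve(G[fast], coded[fast])   # recover the k partial gradients
full_grad = decoded.sum(axis=0)

exact = X.T @ (X @ w - y)   # gradient over the full dataset, for comparison
```

The decoded sum matches the exact full-batch gradient, so the update step proceeds without waiting for the slowest workers.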

References

SHOWING 1-10 OF 47 REFERENCES
D2D-Assisted Federated Learning in Mobile Edge Computing Networks
TLDR
The results show that D2D-FedAvg lowers the communication cost relative to typical Federated Averaging (FedAvg) in cellular networks as the number of users increases, while maintaining the same learning accuracy as FedAvg across the board.
Attention-Weighted Federated Deep Reinforcement Learning for Device-to-Device Assisted Heterogeneous Collaborative Edge Caching
TLDR
An attention-weighted federated deep reinforcement learning (AWFDRL) model is designed that uses federated learning to improve the training efficiency of the Q-learning network under limited computing and storage capacity, and incorporates an attention mechanism to optimize the aggregation weights and avoid imbalance in local model quality.
Broadband Analog Aggregation for Low-Latency Federated Edge Learning
TLDR
This work designs a low-latency multi-access scheme for edge learning based on a popular privacy-preserving framework, federated edge learning (FEEL), and derives two tradeoffs between communication and learning metrics, which are useful for network planning and optimization.
Coded Computing for Low-Latency Federated Learning Over Wireless Edge Networks
TLDR
This work proposes a novel coded computing framework, CodedFedL, that injects structured coding redundancy into federated learning for mitigating stragglers and speeding up the training procedure.
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
TLDR
In a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved, which raises challenges of communication costs, resource allocation, and privacy and security in the implementation of FL at scale.
Federated Learning via Over-the-Air Computation
TLDR
A novel over-the-air computation based approach for fast global model aggregation that exploits the superposition property of a wireless multiple-access channel, together with a difference-of-convex-functions (DC) representation of the sparse and low-rank objective to enhance sparsity and accurately handle the fixed-rank constraint in the device selection procedure.
Coded Computing for Distributed Machine Learning in Wireless Edge Network
TLDR
A coded computation framework is proposed that utilizes statistical knowledge of resource heterogeneity to determine optimal encoding and load balancing of training data using random linear codes, while avoiding an explicit gradient-decoding step.
Federated Learning: Strategies for Improving Communication Efficiency
TLDR
Two ways to reduce the uplink communication costs are proposed: structured updates, where the user directly learns an update from a restricted space parametrized using a smaller number of variables, e.g. either low-rank or a random mask; and sketched updates, which learn a full model update and then compress it using a combination of quantization, random rotations, and subsampling.
CMFL: Mitigating Communication Overhead for Federated Learning
  • Luping Wang, Wei Wang, Bo Li
  • 2019 IEEE 39th International Conference on Distributed Computing Systems (ICDCS), 2019
TLDR
Communication-Mitigated Federated Learning provides clients with feedback information regarding the global tendency of model updating and can substantially reduce the communication overhead while still guaranteeing the learning convergence.
The Disruptions of 5G on Data-Driven Technologies and Applications
TLDR
5G will make the world even more densely and closely connected, and what the authors have experienced in 4G connectivity will pale in comparison to the vast possibilities engendered by 5G.