Corpus ID: 212644577

Joint Parameter-and-Bandwidth Allocation for Improving the Efficiency of Partitioned Edge Learning

@article{Wen2020JointPA,
  title={Joint Parameter-and-Bandwidth Allocation for Improving the Efficiency of Partitioned Edge Learning},
  author={Dingzhu Wen and Mehdi Bennis and Kaibin Huang},
  journal={arXiv preprint arXiv:2003.04544},
  year={2020}
}
To leverage data and computation capabilities of mobile devices, machine learning algorithms are deployed at the network edge for training artificial intelligence (AI) models, resulting in the new paradigm of edge learning. In this paper, we consider the framework of partitioned edge learning for iteratively training a large-scale model using many resource-constrained devices (called workers). To this end, in each iteration, the model is dynamically partitioned into parametric blocks, which are…
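The per-iteration scheme sketched in the abstract, where the model is split into parametric blocks and each block is updated by a different worker, resembles block coordinate descent (one of the works the paper references). Below is a minimal Python sketch under that reading; the function names, the toy quadratic objective, and the synchronous update schedule are illustrative assumptions, not the paper's algorithm:

```python
def partition_blocks(num_params, num_workers):
    """Split parameter indices into (nearly) equal contiguous blocks,
    one block per worker."""
    base, extra = divmod(num_params, num_workers)
    blocks, start = [], 0
    for w in range(num_workers):
        size = base + (1 if w < extra else 0)
        blocks.append(list(range(start, start + size)))
        start += size
    return blocks

def partitioned_round(theta, grad_fn, blocks, lr=0.1):
    """One communication round: every worker receives the current model,
    computes the gradient, and updates only its assigned parameter block."""
    g = grad_fn(theta)
    new_theta = list(theta)
    for block in blocks:          # in practice each block runs on a device
        for i in block:
            new_theta[i] = theta[i] - lr * g[i]
    return new_theta

# Toy objective: f(theta) = 0.5 * sum_i (theta_i - target_i)^2
target = [1.0, -2.0, 0.5, 3.0, -1.0]
grad_fn = lambda th: [th[i] - target[i] for i in range(len(th))]

theta = [0.0] * len(target)
blocks = partition_blocks(len(theta), num_workers=3)
for _ in range(100):
    theta = partitioned_round(theta, grad_fn, blocks)
```

The paper's contribution, per the title, is allocating bandwidth jointly with this parameter partition so that slower devices receive smaller blocks or more spectrum; the fixed equal-size partition above is only the baseline such a policy would improve on.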
