Corpus ID: 231839418

DEAL: Decremental Energy-Aware Learning in a Federated System

@article{Zou2021DEALDE,
  title={DEAL: Decremental Energy-Aware Learning in a Federated System},
  author={Wenting Zou and Li Li and Zichen Xu and Chengzhong Xu},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.03051}
}
Federated learning struggles with its heavy energy footprint on battery-powered devices. The learning process keeps all devices awake, draining expensive battery power to train a shared model collaboratively, yet it may still leak sensitive personal information. Traditional energy management techniques in system kernel mode can force a training device into low-power states, but they may violate the SLO (service-level objective) of the collaborative learning. To address the conflict between learning SLO and… 
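The conflict the abstract describes can be illustrated with a toy policy sketch. This is a hypothetical illustration, not the paper's actual DEAL algorithm: it greedily keeps the most energy-efficient devices awake until an assumed SLO utility target is met, letting the rest drop into low-power states. All names (`select_awake_devices`, the cost/utility model) are invented for illustration.

```python
def select_awake_devices(devices, slo_utility):
    """devices: list of (name, battery_cost, utility) tuples.
    Greedily keeps the devices with the best utility-per-battery-cost
    ratio awake until the summed utility reaches the SLO target;
    the remaining devices may enter low-power states."""
    ranked = sorted(devices, key=lambda d: d[2] / d[1], reverse=True)
    awake, total_utility = [], 0.0
    for name, cost, utility in ranked:
        if total_utility >= slo_utility:
            break  # SLO already met; the rest can sleep
        awake.append(name)
        total_utility += utility
    return awake

# Example: three devices as (name, battery cost, utility contribution).
fleet = [("phone_a", 2.0, 0.5), ("phone_b", 1.0, 0.4), ("phone_c", 4.0, 0.3)]
print(select_awake_devices(fleet, slo_utility=0.8))  # ['phone_b', 'phone_a']
```

Here `phone_c` never wakes up: the SLO is satisfiable by the two devices with better utility-to-energy ratios, which captures the decremental idea of trimming energy spend without breaching the learning objective.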


References

Showing 1–10 of 40 references

Exploring federated learning on battery-powered devices

This paper explores the feasibility of running federated learning on many battery-powered devices, and proposes a two-layered strategy that processes the learning on batteries with a reasonable tradeoff.

Efficient and Private Federated Learning using TEE

This work proposes an efficient and private federated learning framework for edge computing that strengthens protection of the trusted part of the model using data obliviousness, and applies differential privacy to the stochastic gradient descent (SGD) algorithm at the CA's layers and in the communication between the TA's layers and the CA's layers.

CAPMAN: Cooling and Active Power Management in big.LITTLE Battery Supported Devices

A system framework, called CAPMAN, that supports joint optimization of cooling and active power management in smartphones; it achieves 114% longer service time under skewed loads, with a 55% performance gain and 53% less energy use on average.

Adaptive Federated Learning in Resource Constrained Edge Computing Systems

This paper analyzes the convergence bound of distributed gradient descent from a theoretical point of view, and proposes a control algorithm that determines the best tradeoff between local update and global parameter aggregation to minimize the loss function under a given resource budget.
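The tradeoff this reference describes can be sketched in a few lines. This is a crude stand-in, not the authors' control algorithm: it assumes an illustrative cost model (each round costs `tau` local-compute units plus a fixed communication cost) and a toy progress proxy where extra local steps have diminishing returns, then picks the local-step count `tau` that maximizes progress under a fixed budget.

```python
import math

def rounds_affordable(budget, tau, compute_cost=1.0, comm_cost=10.0):
    """Global rounds affordable when each round spends tau units of
    local compute plus one communication/aggregation (assumed costs)."""
    return budget // (tau * compute_cost + comm_cost)

def progress(budget, tau):
    """Toy proxy for loss reduction: each round contributes sqrt(tau),
    so additional local steps between aggregations have
    diminishing returns."""
    return rounds_affordable(budget, tau) * math.sqrt(tau)

def choose_tau(budget, taus=range(1, 65)):
    """Pick the number of local updates per aggregation that maximizes
    total progress under the resource budget."""
    return max(taus, key=lambda t: progress(budget, t))

print(choose_tau(200))
```

Under these assumed costs the optimum is an interior value of `tau`: too few local steps waste the budget on communication, too many leave almost no rounds for aggregation, which is exactly the tradeoff the paper's control algorithm tunes (with a real convergence bound rather than this toy proxy).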

Federated Learning: Collaborative Machine Learning without Centralized Training Data

Federated learning allows several actors to collaboratively develop a single, robust machine learning model without sharing data, addressing crucial issues such as data privacy, data security, data access rights, and access to heterogeneous data.
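The core idea summarized above, clients train locally and only share model weights, can be shown in a minimal federated-averaging sketch. This is plain-Python for illustration (a real system would run SGD on actual local datasets); the function names and the two-client example are invented here.

```python
def client_update(weights, local_gradient, lr=0.1):
    """One local step: a client adjusts the shared weights using a
    gradient computed from its private data (the data itself is
    never transmitted)."""
    return [w - lr * g for w, g in zip(weights, local_gradient)]

def server_aggregate(client_weights, client_sizes):
    """Weighted average of client models, weighted by local dataset
    size -- the server only ever sees weights, never raw data."""
    total = sum(client_sizes)
    return [
        sum(n * w[i] for w, n in zip(client_weights, client_sizes)) / total
        for i in range(len(client_weights[0]))
    ]

# Two clients with private gradients; 100 and 300 local examples.
global_w = [0.0, 0.0]
w1 = client_update(global_w, [1.0, -2.0])
w2 = client_update(global_w, [3.0, 4.0])
print(server_aggregate([w1, w2], [100, 300]))  # size-weighted average
```

The aggregation weights by dataset size so that a client with more examples pulls the global model further, which is the standard FedAvg design choice.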

Asynchronous Federated Learning for Geospatial Applications

This work presents a new asynchronous federated learning algorithm and studies its convergence rate when training is distributed across many edge devices with hard data constraints, relative to training the same model on a single device.

A survey on security and privacy of federated learning

Federated Multi-Task Learning

This work shows that multi-task learning is naturally suited to handle the statistical challenges of this setting, and proposes a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues.

Where is the energy spent inside my app?: fine grained energy accounting on smartphones with Eprof

Bundles are proposed, a new accounting presentation of app I/O energy, which helps the developer to quickly understand and optimize the energy drain of her app.

Federated Learning: Challenges, Methods, and Future Directions

The unique characteristics and challenges of federated learning are discussed, a broad overview of current approaches are provided, and several directions of future work that are relevant to a wide range of research communities are outlined.