Machine Learning at the Network Edge: A Survey
@article{Murshed2019MachineLA,
  title={Machine Learning at the Network Edge: A Survey},
  author={M. G. Sarwar Murshed and Chris Murphy and Daqing Hou and Nazar Khan and Ganesh Ananthanarayanan and Faraz Hussain},
  journal={ACM Computing Surveys (CSUR)},
  year={2019},
  volume={54},
  pages={1--37}
}
Resource-constrained IoT devices, such as sensors and actuators, have become ubiquitous in recent years. This has led to the generation of large quantities of data in real time, which is an appealing target for AI systems. However, deploying machine learning models on such end-devices is nearly impossible. A typical solution involves offloading data to external computing systems (such as cloud servers) for further processing, but this worsens latency, leads to increased communication costs, and…
140 Citations
Multi-Component Optimization and Efficient Deployment of Neural-Networks on Resource-Constrained IoT Hardware
- Computer Science, arXiv
- 2022
This paper presents an end-to-end multi-component model optimization sequence that can be applied to any state-of-the-art models trained for anomaly detection, predictive maintenance, robotics, voice recognition, and machine vision, and open-source its implementation.
Practice of Applied Edge Analytics in Intelligent Learning Framework
- Computer Science, 2020 21st International Arab Conference on Information Technology (ACIT)
- 2020
The network of edge computing and its variance from cloud computing, edge architecture, and diverse applications of machine learning algorithms and deep learning framework deployed at the edge network for intelligent analytics are described.
Machine and Deep Learning for Resource Allocation in Multi-Access Edge Computing: A Survey
- Computer Science, IEEE Communications Surveys & Tutorials
- 2022
This paper presents a comprehensive survey of ML/DL-based RA mechanisms in MEC, and provides an in-depth survey of recent works that used ML/DL methods for RA in MEC from three aspects.
Sparta: Heat-Budget-Based Scheduling Framework on IoT Edge Systems
- Computer Science, EDGE
- 2021
A heat-budget-based scheduling system, called Sparta, which leverages dynamic voltage and frequency scaling (DVFS) to adaptively control CPU temperature; it maintains CPU temperature below the threshold 94% of the time while improving execution time by 1.04x over competitive approaches.
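The heat-budget idea can be illustrated with a toy DVFS-style control loop. This is a minimal sketch under made-up constants (frequency steps, thermal model, and threshold are all illustrative), not Sparta's actual scheduling algorithm:

```python
# Toy DVFS control loop: throttle a simulated CPU frequency down as a
# simulated temperature approaches a budget, and speed back up when there
# is thermal headroom. All constants are illustrative, not from the paper.

FREQS_MHZ = [600, 900, 1200, 1500]   # available frequency steps
THRESHOLD_C = 80.0                   # temperature budget

def simulate(workload_steps, ambient=40.0):
    temp, f_idx = ambient, len(FREQS_MHZ) - 1
    history = []
    for _ in range(workload_steps):
        freq = FREQS_MHZ[f_idx]
        # crude thermal model: heating grows with frequency, cooling with
        # the gap between current temperature and ambient
        temp += 0.05 * freq / 100 - 0.015 * (temp - ambient)
        if temp > THRESHOLD_C - 2 and f_idx > 0:
            f_idx -= 1          # throttle down before crossing the budget
        elif temp < THRESHOLD_C - 10 and f_idx < len(FREQS_MHZ) - 1:
            f_idx += 1          # headroom available: speed back up
        history.append((freq, temp))
    return history

hist = simulate(200)
```

Even this toy controller keeps the simulated temperature under the budget at the cost of running part of the workload at reduced frequency, which is the trade-off the paper quantifies.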
Communication-Efficient Edge AI: Algorithms and Systems
- Computer Science, IEEE Communications Surveys & Tutorials
- 2020
A comprehensive survey of the recent developments in various techniques for overcoming key communication challenges in edge AI systems is presented, and communication-efficient techniques are introduced from both algorithmic and system perspectives for training and inference tasks at the network edge.
Low Latency Deep Learning Inference Model for Distributed Intelligent IoT Edge Clusters
- Computer Science, IEEE Access
- 2021
This work deploys a convolutional neural network (CNN) on resource-constrained IoT devices to make them intelligent and realistic, and proposes decentralized heterogeneous edge clusters deployed with an optimized pre-trained YOLOv2 model.
Machine Learning at Resource Constraint Edge Device Using Bonsai Algorithm
- Computer Science, 2020 Third International Conference on Advances in Electronics, Computers and Communications (ICAECC)
- 2020
The experiment is conducted on a publicly available dataset with the Bonsai algorithm, implemented in a Linux environment on a Core i5 processor in Python 2.7, and achieved 92% accuracy with a model size of 6.25 KB, which can be easily deployed on resource-constrained IoT devices.
Making distributed edge machine learning for resource-constrained communities and environments smarter: contexts and challenges
- Computer Science, Journal of Reliable Intelligent Environments
- 2022
This paper analyzes representative real-world business scenarios for edge ML solutions and their contexts in resource-constrained communities and environments, and identifies and maps the key distinguishing contexts of distributed edge ML.
Pervasive AI for IoT Applications: A Survey on Resource-Efficient Distributed Artificial Intelligence
- Computer Science, IEEE Communications Surveys & Tutorials
- 2022
A comprehensive survey of the recent techniques and strategies developed to overcome resource challenges in pervasive AI systems, along with an overview of pervasive computing, its architecture, and its intersection with artificial intelligence.
Machine Learning in Resource-Scarce Embedded Systems, FPGAs, and End-Devices: A Survey
- Computer Science, Electronics
- 2019
A study of the optimizations, algorithms, and platforms used to implement such models at the network's end, where highly resource-scarce microcontroller units (MCUs) are found.
References
Showing 1-10 of 183 references
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
- Computer Science, IEEE Communications Surveys & Tutorials
- 2020
In a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved, which raises challenges of communication costs, resource allocation, and privacy and security in the implementation of FL at scale.
Deep Learning at the Edge
- Computer Science, 2018 International Conference on Computational Science and Computational Intelligence (CSCI)
- 2018
One of the most widely used machine learning methods, namely, Deep Learning (DL), is discussed and a short survey on the recent approaches used to map DL onto the edge computing paradigm is offered.
Wireless Network Intelligence at the Edge
- Computer Science, Proceedings of the IEEE
- 2019
In a first of its kind, this article explores the key building blocks of edge ML, different neural network architectural splits and their inherent tradeoffs, as well as theoretical and technical enablers stemming from a wide range of mathematical disciplines.
When Edge Meets Learning: Adaptive Control for Resource-Constrained Distributed Machine Learning
- Computer Science, IEEE INFOCOM 2018 - IEEE Conference on Computer Communications
- 2018
This paper analyzes the convergence rate of distributed gradient descent from a theoretical point of view, and proposes a control algorithm that determines the best trade-off between local update and global parameter aggregation to minimize the loss function under a given resource budget.
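The local-update versus global-aggregation trade-off can be seen in a toy distributed gradient-descent loop, where `tau` controls how many local steps each worker takes between aggregations. This is a minimal sketch of the generic scheme on synthetic quadratic losses, not the paper's control algorithm:

```python
import random

# Toy distributed SGD on 1-D quadratics: each of 5 workers holds a local
# loss (w - c_k)^2, so the global optimum is the mean of the c_k.
# tau = number of local gradient steps between global aggregations.

def run(tau, rounds, lr=0.1, seed=0):
    rng = random.Random(seed)
    centers = [rng.uniform(-1, 1) for _ in range(5)]   # per-worker optima
    w_global = 5.0                                     # shared initial model
    for _ in range(rounds):
        local_models = []
        for c in centers:
            w = w_global
            for _ in range(tau):            # tau local updates on local data
                w -= lr * 2 * (w - c)       # gradient of (w - c)^2
            local_models.append(w)
        w_global = sum(local_models) / len(local_models)  # aggregation
    return w_global
```

With a fixed communication budget, raising `tau` buys more local computation per aggregation round; the paper's contribution is choosing that trade-off adaptively under a resource budget.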
Edge Machine Learning: Enabling Smart Internet of Things Applications
- Computer Science, Big Data Cogn. Comput.
- 2018
A step forward has been taken to understand the feasibility of running machine learning algorithms, both training and inference, on a Raspberry Pi running an embedded version of the Android operating system designed for IoT device development.
DeepThings: Distributed Adaptive Deep Learning Inference on Resource-Constrained IoT Edge Clusters
- Computer Science, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
- 2018
DeepThings is proposed, a framework for adaptively distributed execution of CNN-based inference applications on tightly resource-constrained IoT edge clusters that employs a scalable Fused Tile Partitioning of convolutional layers to minimize memory footprint while exposing parallelism.
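The tile-partitioning idea can be sketched in one dimension: split the input into overlapping tiles (each carrying the halo of extra elements a convolution needs) so that each device computes its slice of the output independently. This toy uses a single 1-D convolution as a stand-in for a CNN layer; DeepThings' actual Fused Tile Partitioning fuses the tiling across multiple layers:

```python
# Toy tile partitioning for a 1-D "valid" convolution: each tile of the
# output can be computed from a slice of the input plus K-1 halo elements,
# so tiles can run on separate devices and be concatenated afterwards.

def conv1d(x, k):
    K = len(k)
    return [sum(x[i + j] * k[j] for j in range(K))
            for i in range(len(x) - K + 1)]

def tiled_conv1d(x, k, n_tiles):
    K = len(k)
    out_len = len(x) - K + 1
    per = -(-out_len // n_tiles)        # ceil division: outputs per tile
    out = []
    for t in range(n_tiles):
        lo = t * per
        hi = min(out_len, lo + per)
        if lo >= hi:
            break
        # the tile's input slice includes K-1 overlapping halo elements
        out.extend(conv1d(x[lo:hi + K - 1], k))
    return out

x = list(range(20))
k = [1, 0, -1]
full = conv1d(x, k)
tiled = tiled_conv1d(x, k, 4)
```

The per-tile memory footprint shrinks with the number of tiles, at the cost of recomputing or exchanging the halo regions, which is the trade-off the fused partitioning scheme manages.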
Image classification on IoT edge devices: profiling and modeling
- Computer Science, Cluster Computing
- 2019
The results indicate that the random forest model outperforms the other two algorithms, with R-squared values of 0.95 and 0.79, and served as a feature-extraction mechanism that enabled the authors to identify which predictor variables influenced their model the most.
Fully Distributed Deep Learning Inference on Resource-Constrained Edge Devices
- Computer Science, SAMOS
- 2019
This paper jointly optimize memory, computation and communication demands for distributed execution of complete neural networks covering all layers through techniques that combine both feature and weight partitioning with a communication-aware layer fusion approach to enable holistic optimization across layers.
Enabling Deep Learning on IoT Edge: Approaches and Evaluation
- Computer Science, 2018 IEEE/ACM Symposium on Edge Computing (SEC)
- 2018
The results show that deep learning on the IoT edge can be enabled if applied in an efficient manner on an IoT platform equipped with an integrated GPU and ARM processor.
Strategies for Re-Training a Pruned Neural Network in an Edge Computing Paradigm
- Computer Science, 2017 IEEE International Conference on Edge Computing (EDGE)
- 2017
Introduces the concept of re-training pruned networks, which should aid personalization of smart devices as well as increase their fault tolerance, and shows that a significant improvement on new data may be obtained while minimizing the reduction in performance on the original data.
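The prune-then-re-train idea can be sketched on a toy least-squares problem: train a dense model, zero out the smallest-magnitude weights, then continue training only the surviving weights through a mask. This is purely illustrative of the general technique, not the paper's networks or strategies:

```python
import random

# Toy magnitude pruning with masked re-training on linear regression.
# The mask freezes pruned weights at zero during the re-training pass.

def train(w, data, mask, lr=0.01, epochs=200):
    for _ in range(epochs):
        for x, y in data:
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - y
            # masked SGD step: pruned weights receive no update
            w = [wi - lr * err * xi * mi
                 for wi, xi, mi in zip(w, x, mask)]
    return w

rng = random.Random(0)
true_w = [2.0, 0.0, -1.5, 0.0]                  # sparse ground truth
data = []
for _ in range(50):
    x = [rng.uniform(-1, 1) for _ in range(4)]
    data.append((x, sum(wi * xi for wi, xi in zip(true_w, x))))

dense = train([0.0] * 4, data, mask=[1] * 4)     # 1. train dense model
keep = sorted(range(4), key=lambda i: -abs(dense[i]))[:2]
mask = [1 if i in keep else 0 for i in range(4)]  # 2. prune smallest weights
pruned = [wi * mi for wi, mi in zip(dense, mask)]
retrained = train(pruned, data, mask)             # 3. re-train survivors
```

Re-training on new data with the same mask is what lets a pruned model adapt (personalize) without regrowing the pruned connections.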