Machine Learning at the Network Edge: A Survey

@article{Murshed2019MachineLA,
  title={Machine Learning at the Network Edge: A Survey},
  author={M. G. Sarwar Murshed and Chris Murphy and Daqing Hou and Nazar Khan and Ganesh Ananthanarayanan and Faraz Hussain},
  journal={ACM Computing Surveys (CSUR)},
  year={2019},
  volume={54},
  pages={1--37}
}
Resource-constrained IoT devices, such as sensors and actuators, have become ubiquitous in recent years. This has led to the generation of large quantities of data in real-time, which is an appealing target for AI systems. However, deploying machine learning models on such end-devices is nearly impossible. A typical solution involves offloading data to external computing systems (such as cloud servers) for further processing but this worsens latency, leads to increased communication costs, and… 

Citations

Multi-Component Optimization and Efficient Deployment of Neural-Networks on Resource-Constrained IoT Hardware

This paper presents an end-to-end multi-component model optimization sequence that can be applied to any state-of-the-art model trained for anomaly detection, predictive maintenance, robotics, voice recognition, or machine vision, and open-sources its implementation.
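
As a flavor of what one component of such an optimization sequence looks like, the sketch below applies post-training (dynamic-range) quantization with the TensorFlow Lite converter; the saved-model path and output filename are placeholders, not artifacts from the cited paper.

    # One representative optimization component: post-training dynamic-range
    # quantization with TensorFlow Lite. Paths are placeholders.
    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model("anomaly_model/")
    converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights to 8 bits
    tflite_model = converter.convert()

    with open("anomaly_model_quant.tflite", "wb") as f:
        f.write(tflite_model)  # compact model for deployment on a constrained device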

Practice of Applied Edge Analytics in Intelligent Learning Framework

The edge computing network and how it differs from cloud computing, the edge architecture, and the diverse applications of machine learning algorithms and deep learning frameworks deployed at the network edge for intelligent analytics are described.

Machine and Deep Learning for Resource Allocation in Multi-Access Edge Computing: A Survey

This paper presents a comprehensive survey of ML/DL-based RA mechanisms in MEC, and provides an in-depth survey of recent works that used ML/DL methods for RA in MEC from three aspects.

Sparta: Heat-Budget-Based Scheduling Framework on IoT Edge Systems

A heat-budget-based scheduling system called Sparta leverages dynamic voltage and frequency scaling (DVFS) to adaptively control CPU temperature; it maintains CPU temperature below the threshold 94% of the time while improving execution time by 1.04x over competing approaches.
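
To make the DVFS mechanism concrete, here is a minimal thermal-throttling loop of the kind Sparta generalizes. It assumes a Linux board exposing the standard thermal and cpufreq sysfs files, root privileges, and the "userspace" frequency governor; the paths, threshold, and frequencies are placeholders, and this is not Sparta's actual scheduling policy.

    # Minimal DVFS thermal-throttling loop (illustrative; not Sparta's policy).
    # Assumes standard Linux sysfs paths, root privileges, and the "userspace"
    # cpufreq governor; threshold and frequencies are placeholders.
    import time

    TEMP_PATH = "/sys/class/thermal/thermal_zone0/temp"
    FREQ_PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_setspeed"
    LOW_FREQ, HIGH_FREQ = 600000, 1500000   # kHz
    THRESHOLD = 80000                       # millidegrees Celsius

    def read_temp():
        with open(TEMP_PATH) as f:
            return int(f.read().strip())

    def set_freq(khz):
        with open(FREQ_PATH, "w") as f:
            f.write(str(khz))

    while True:
        # Throttle when the temperature exceeds the budget, restore otherwise.
        set_freq(LOW_FREQ if read_temp() > THRESHOLD else HIGH_FREQ)
        time.sleep(1)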

Communication-Efficient Edge AI: Algorithms and Systems

A comprehensive survey of the recent developments in various techniques for overcoming key communication challenges in edge AI systems is presented, and communication-efficient techniques are introduced from both algorithmic and system perspectives for training and inference tasks at the network edge.
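
One representative communication-efficient technique covered by such surveys is top-k gradient sparsification: only the k largest-magnitude gradient entries (values plus indices) are transmitted instead of the dense gradient. The NumPy sketch below is illustrative; the tensor size and k are assumptions, not values from the paper.

    # Top-k gradient sparsification: send only the k largest-magnitude entries.
    import numpy as np

    def sparsify(grad, k):
        idx = np.argpartition(np.abs(grad), -k)[-k:]   # indices of the k largest entries
        return idx, grad[idx]

    def densify(idx, vals, size):
        out = np.zeros(size)
        out[idx] = vals
        return out

    grad = np.random.randn(10000)
    idx, vals = sparsify(grad, k=100)          # ~1% of entries sent to the server
    recovered = densify(idx, vals, grad.size)  # server-side reconstruction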

Low Latency Deep Learning Inference Model for Distributed Intelligent IoT Edge Clusters

This work deploys a convolutional neural network (CNN) on resource-constrained IoT devices to make them intelligent, and proposes decentralized heterogeneous edge clusters deployed with an optimized pre-trained YOLOv2 model.

Machine Learning at Resource Constraint Edge Device Using Bonsai Algorithm

The experiment is conducted on a publicly available dataset with the Bonsai algorithm, implemented in a Linux environment on a Core i5 processor in Python 2.7, and achieves 92% accuracy with a model size of 6.25 KB, which can be easily deployed on resource-constrained IoT devices.
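
For context, Bonsai predicts with a single shallow tree acting in a learned low-dimensional projection of the input, which is what keeps the model in the kilobyte range. The sketch below is a simplified, randomly initialized version of that prediction function; the dimensions, tree depth, and binary-classification setting are assumptions, and the sparse training procedure is omitted.

    # Simplified sketch of Bonsai-style inference: project the input with Z,
    # then sum W_k^T(Zx) * tanh(sigma * V_k^T(Zx)) over the root-to-leaf path
    # chosen by branching vectors theta_k. Randomly initialized, illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    D, d, depth, sigma = 64, 8, 2, 1.0            # input dim, projected dim, tree depth
    n_nodes = 2 ** (depth + 1) - 1                # complete binary tree
    Z = rng.standard_normal((d, D)) * 0.1         # projection matrix (sparse in practice)
    W = rng.standard_normal((n_nodes, d))         # per-node predictor vectors
    V = rng.standard_normal((n_nodes, d))
    theta = rng.standard_normal((n_nodes, d))     # per-node branching vectors

    def bonsai_score(x):
        z = Z @ x                                 # project to the low-dimensional space
        node, score = 0, 0.0
        for _ in range(depth + 1):                # accumulate along the root-to-leaf path
            score += (W[node] @ z) * np.tanh(sigma * (V[node] @ z))
            node = 2 * node + (1 if theta[node] @ z > 0 else 2)  # branch left/right
        return score                              # sign(score) gives the binary label

    print(bonsai_score(rng.standard_normal(D)))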

Making distributed edge machine learning for resource-constrained communities and environments smarter: contexts and challenges

This paper analyzes representative real-world business scenarios for edge ML solutions and their contexts in resource-constrained communities and environments, and identifies and maps the key distinguishing contexts of distributed edge ML.

Pervasive AI for IoT Applications: A Survey on Resource-Efficient Distributed Artificial Intelligence

A comprehensive survey of the recent techniques and strategies developed to overcome resource challenges in pervasive AI systems, and an overview of pervasive computing, its architecture, and its intersection with artificial intelligence.

Machine Learning in Resource-Scarce Embedded Systems, FPGAs, and End-Devices: A Survey

A study of the optimizations, algorithms, and platforms used to implement such models at the network's end, where highly resource-scarce microcontroller units (MCUs) are found.
...

References

Showing 1–10 of 183 references

Federated Learning in Mobile Edge Networks: A Comprehensive Survey

In a large-scale and complex mobile edge network, heterogeneous devices with varying constraints are involved, which raises challenges of communication cost, resource allocation, and privacy and security in the implementation of FL at scale.

Deep Learning at the Edge

One of the most widely used machine learning methods, namely, Deep Learning (DL), is discussed and a short survey on the recent approaches used to map DL onto the edge computing paradigm is offered.

Wireless Network Intelligence at the Edge

In a first-of-its-kind effort, this article explores the key building blocks of edge ML, different neural network architectural splits and their inherent tradeoffs, as well as theoretical and technical enablers stemming from a wide range of mathematical disciplines.

When Edge Meets Learning: Adaptive Control for Resource-Constrained Distributed Machine Learning

This paper analyzes the convergence rate of distributed gradient descent from a theoretical point of view, and proposes a control algorithm that determines the best trade-off between local update and global parameter aggregation to minimize the loss function under a given resource budget.
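
The pattern being controlled is sketched below: K workers each take tau local gradient steps, a global average is formed, and the process repeats until an abstract resource budget is exhausted. The toy linear-regression task, the cost constants, and the fixed tau are placeholders; the paper's contribution is choosing tau adaptively, which is not reproduced here.

    # K workers each take tau local gradient-descent steps on private data,
    # then their models are averaged (global aggregation), until an abstract
    # resource budget is spent. Illustrative only; not the paper's algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    K, d, tau, lr = 4, 5, 5, 0.05
    budget, local_cost, agg_cost = 500.0, 1.0, 10.0   # abstract resource units
    w_true = rng.standard_normal(d)

    datasets = []
    for _ in range(K):                                # one private dataset per worker
        X = rng.standard_normal((100, d))
        datasets.append((X, X @ w_true + 0.1 * rng.standard_normal(100)))

    w_global, spent = np.zeros(d), 0.0
    while spent + K * tau * local_cost + agg_cost <= budget:
        local_models = []
        for X, y in datasets:                         # tau local updates per worker
            w = w_global.copy()
            for _ in range(tau):
                w -= lr * (2 * X.T @ (X @ w - y) / len(y))
            local_models.append(w)
        w_global = np.mean(local_models, axis=0)      # global aggregation
        spent += K * tau * local_cost + agg_cost

    print("distance to true weights:", np.linalg.norm(w_global - w_true))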

Edge Machine Learning: Enabling Smart Internet of Things Applications

A step forward has been taken to understand the feasibility of running machine learning algorithms, both training and inference, on a Raspberry Pi running Android Things, an embedded version of the Android operating system designed for IoT device development.
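
A typical on-device inference workload in such feasibility studies looks like the following TensorFlow Lite snippet, runnable on a Raspberry Pi with the tflite_runtime package. The model file name is a placeholder, and the snippet is not code from the cited paper.

    # Minimal on-device inference with the TensorFlow Lite runtime.
    # "model.tflite" is a placeholder model file.
    import numpy as np
    from tflite_runtime.interpreter import Interpreter

    interpreter = Interpreter(model_path="model.tflite")
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Feed a random tensor with the model's expected shape and dtype.
    x = np.random.random_sample(inp["shape"]).astype(inp["dtype"])
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
    print(interpreter.get_tensor(out["index"]))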

DeepThings: Distributed Adaptive Deep Learning Inference on Resource-Constrained IoT Edge Clusters

DeepThings is proposed, a framework for adaptively distributed execution of CNN-based inference applications on tightly resource-constrained IoT edge clusters that employs a scalable Fused Tile Partitioning of convolutional layers to minimize memory footprint while exposing parallelism.
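
The core idea behind tile partitioning can be shown in a few lines: split a convolution's output into strips, give each device only the overlapping input slice its strip depends on, and stitch the results. The NumPy sketch below covers a single layer with placeholder shapes and is not DeepThings' multi-layer Fused Tile Partitioning.

    # Tile partitioning of one convolutional layer across devices (illustrative).
    import numpy as np

    def conv_valid(img, kern):
        kh, kw = kern.shape
        H, W = img.shape
        out = np.empty((H - kh + 1, W - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kern)
        return out

    rng = np.random.default_rng(0)
    img, kern = rng.standard_normal((32, 32)), rng.standard_normal((3, 3))
    full = conv_valid(img, kern)

    n_devices = 4
    rows = np.array_split(np.arange(full.shape[0]), n_devices)  # output rows per device
    tiles = []
    for r in rows:
        # Each device receives only the input rows its output strip depends on.
        tile_input = img[r[0]: r[-1] + kern.shape[0], :]
        tiles.append(conv_valid(tile_input, kern))

    assert np.allclose(np.vstack(tiles), full)  # stitched strips match the full output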

Image classification on IoT edge devices: profiling and modeling

The results indicate that the random forest model outperforms the other two algorithms, with R-squared values of 0.95 and 0.79, and it also served as a feature extraction mechanism that enabled the authors to identify which predictor variables influenced their model the most.
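
A minimal version of that modeling step, with synthetic placeholder features instead of the paper's measurements, would look like this scikit-learn sketch:

    # Fit a random-forest regressor that predicts inference latency from
    # device/model features, then inspect feature importances.
    # Synthetic placeholder data, not the paper's measurements.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    # Hypothetical predictors: model FLOPs, parameter count, CPU clock, RAM.
    X = rng.uniform(size=(200, 4))
    latency = 3.0 * X[:, 0] + 1.5 * X[:, 1] - 2.0 * X[:, 2] + 0.1 * rng.standard_normal(200)

    rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, latency)
    print("in-sample R^2:", rf.score(X, latency))
    print("feature importances:", rf.feature_importances_)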

Fully Distributed Deep Learning Inference on Resource-Constrained Edge Devices

This paper jointly optimizes memory, computation, and communication demands for the distributed execution of complete neural networks covering all layers, through techniques that combine feature and weight partitioning with a communication-aware layer fusion approach to enable holistic optimization across layers.
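
Weight partitioning, one of the two partitioning axes mentioned, can be illustrated for a single fully connected layer: each device stores one column block of the weight matrix and computes the corresponding slice of the output. The shapes below are placeholders, and the paper's cross-layer fusion is not reproduced.

    # Weight partitioning of one fully connected layer across devices.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(256)              # layer input
    W = rng.standard_normal((256, 512))       # full weight matrix (input dim x output dim)

    n_devices = 4
    blocks = np.array_split(W, n_devices, axis=1)   # one column block per device
    partial = [x @ B for B in blocks]               # each device computes its output slice
    y = np.concatenate(partial)                     # concatenate slices -> full layer output

    assert np.allclose(y, x @ W)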

Enabling Deep Learning on IoT Edge: Approaches and Evaluation

  • Xuan Qi, Chen Liu · 2018 IEEE/ACM Symposium on Edge Computing (SEC), 2018
The results show that deep learning capability at the edge of the IoT can be enabled if applied in an efficient manner on an IoT platform equipped with an integrated GPU and an ARM processor.

Strategies for Re-Training a Pruned Neural Network in an Edge Computing Paradigm

The concept of re-training pruned networks, which should aid the personalization of smart devices as well as increase their fault tolerance, is introduced, and it is shown that a significant improvement on new data can be obtained while minimizing the reduction in performance on the original data.
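
The underlying mechanism is easy to sketch: zero out small-magnitude weights, freeze them with a binary mask, and apply gradient updates on new data only to the surviving weights. The toy linear model below is a placeholder and does not reproduce the paper's specific re-training strategies.

    # Magnitude pruning with a fixed mask, followed by re-training on new data.
    # Toy linear model; placeholders only, not the paper's strategies.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 20
    w_true = rng.standard_normal(d)
    w = w_true + 0.1 * rng.standard_normal(d)          # stand-in for a trained model

    mask = np.abs(w) > np.quantile(np.abs(w), 0.5)     # prune the 50% smallest weights
    w = w * mask

    X_new = rng.standard_normal((200, d))              # "new" data seen at the edge
    y_new = X_new @ w_true

    for _ in range(200):                               # re-train only surviving weights
        grad = 2 * X_new.T @ (X_new @ w - y_new) / len(y_new)
        w -= 0.05 * grad * mask                        # masked update keeps pruned weights at zero

    print("MSE on new data:", np.mean((X_new @ w - y_new) ** 2))
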
...