Automatic differentiation in PyTorch
TLDR: In this article, we describe the automatic differentiation module of PyTorch, a library designed to enable rapid research on machine learning models.
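As a minimal illustration of the module this entry describes, the sketch below uses the public torch.autograd interface (current PyTorch API, not code taken from the paper) to differentiate a small expression built by ordinary Python execution:

    import torch

    # Leaf tensors that request gradient tracking.
    x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    w = torch.tensor([0.5, -1.0, 2.0], requires_grad=True)

    # Running ordinary Python code builds the computation graph dynamically.
    y = (w * x).sum()

    # Reverse-mode automatic differentiation populates .grad on the leaves.
    y.backward()

    print(x.grad)  # dy/dx = w
    print(w.grad)  # dy/dw = x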
PyTorch: An Imperative Style, High-Performance Deep Learning Library
TLDR: This paper introduces PyTorch, a Python library that performs immediate execution of dynamic tensor computations with automatic differentiation and GPU acceleration, and does so while maintaining performance comparable to the fastest current libraries for deep learning.
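The paper's central claim is that execution is imperative: each operation runs as soon as it is called, on CPU or GPU. A minimal sketch of that style, using only standard PyTorch calls:

    import torch

    # Pick a GPU if one is present; otherwise fall back to the CPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"

    a = torch.randn(1024, 1024, device=device)
    b = torch.randn(1024, 1024, device=device)

    # The matrix product executes eagerly; there is no separate
    # graph-compilation step before the result becomes available.
    c = a @ b
    print(c.shape, c.device)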
ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation
TLDR: In this paper, we propose a novel deep neural network architecture named ENet (efficient neural network), created specifically for tasks requiring low-latency operation.
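As context for the architecture, ENet's initial block concatenates a strided 3×3 convolution (13 filters) with a max-pooling of the input, yielding 16 cheap early feature maps. The sketch below is our paraphrase in PyTorch, not the authors' reference implementation:

    import torch
    import torch.nn as nn

    class ENetInitialBlock(nn.Module):
        # 13 convolutional filters plus the 3 pooled input channels = 16 maps.
        def __init__(self, in_channels=3, out_channels=16):
            super().__init__()
            self.conv = nn.Conv2d(in_channels, out_channels - in_channels,
                                  kernel_size=3, stride=2, padding=1)
            self.pool = nn.MaxPool2d(kernel_size=2, stride=2)

        def forward(self, x):
            return torch.cat([self.conv(x), self.pool(x)], dim=1)

    x = torch.randn(1, 3, 512, 512)
    print(ENetInitialBlock()(x).shape)  # torch.Size([1, 16, 256, 256])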
An Analysis of Deep Neural Network Models for Practical Applications
TLDR: We present a comprehensive analysis of metrics important in practical applications: accuracy, memory footprint, parameter count, operation count, inference time, and power consumption of state-of-the-art DNNs.
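Two of the paper's metrics are straightforward to reproduce for any model. The harness below is a rough sketch (it assumes torchvision is installed and uses resnet18 purely as a stand-in for the networks the paper compares):

    import time
    import torch
    import torchvision.models as models

    model = models.resnet18(weights=None).eval()

    # Parameter count, one of the metrics the paper analyzes.
    n_params = sum(p.numel() for p in model.parameters())

    # Crude inference-time estimate on a single 224x224 input.
    x = torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        model(x)  # warm-up pass
        start = time.perf_counter()
        for _ in range(10):
            model(x)
        elapsed = (time.perf_counter() - start) / 10

    print(f"{n_params / 1e6:.1f}M parameters, {elapsed * 1e3:.1f} ms per pass")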
Evaluation of neural network architectures for embedded systems
TLDR: We present a comprehensive analysis of metrics important in practical applications: accuracy, memory footprint, parameter count, operation count, inference time, and power consumption.
PyTorch Distributed: Experiences on Accelerating Data Parallel Training
TLDR: This paper presents the design, implementation, and evaluation of the PyTorch distributed data parallel module.
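The module under evaluation is exposed as torch.nn.parallel.DistributedDataParallel. A minimal single-node sketch using the standard PyTorch API (the toy model and sizes are ours, not the paper's):

    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        # Typically launched via `torchrun --nproc_per_node=N script.py`,
        # which sets the RANK / WORLD_SIZE environment for each process.
        dist.init_process_group(backend="gloo")  # use "nccl" for GPU training

        model = torch.nn.Linear(10, 10)
        ddp_model = DDP(model)  # registers hooks that allreduce gradients

        # Gradients are averaged across all processes during backward().
        loss = ddp_model(torch.randn(20, 10)).sum()
        loss.backward()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()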
PyTorch distributed
TLDR: We present the design, implementation, and evaluation of the distributed data parallel module in PyTorch v1.5, which achieves near-linear scalability using 256 GPUs.
VC density of set systems definable in tree-like graphs
TLDR: We study set systems definable in graphs using variants of logic with different expressive power.
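For context, the standard notion the title refers to (our notation, not necessarily the paper's): for a set system $\mathcal{F}$ over a universe $X$, the shatter function is

    \[
      \pi_{\mathcal{F}}(n) = \max_{A \subseteq X,\; |A| = n}
        \bigl|\{\, F \cap A : F \in \mathcal{F} \,\}\bigr|,
    \]

and the VC density of $\mathcal{F}$ is the polynomial growth rate of that function:

    \[
      \operatorname{vcd}(\mathcal{F}) = \inf \{\, r \ge 0 : \pi_{\mathcal{F}}(n) = O(n^{r}) \,\}.
    \]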