Corpus ID: 202786778

PyTorch: An Imperative Style, High-Performance Deep Learning Library

@inproceedings{Paszke2019PyTorchAI,
  title={PyTorch: An Imperative Style, High-Performance Deep Learning Library},
  author={Adam Paszke and Sam Gross and Francisco Massa and Adam Lerer and James Bradbury and Gregory Chanan and Trevor Killeen and Zeming Lin and Natalia Gimelshein and Luca Antiga and Alban Desmaison and Andreas K{\"o}pf and Edward Yang and Zachary DeVito and Martin Raison and Alykhan Tejani and Sasank Chilamkurthy and Benoit Steiner and Lu Fang and Junjie Bai and Soumith Chintala},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2019}
}
Abstract: Deep learning frameworks have often focused on either usability or speed, but not both. PyTorch is a machine learning library that shows that these two goals are in fact compatible: it was designed from first principles to support an imperative and Pythonic programming style that supports code as a model, makes debugging easy, and is consistent with other popular scientific computing libraries, while remaining efficient and supporting hardware accelerators such as GPUs. In this paper, we detail…
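The imperative, define-by-run style the abstract refers to can be sketched in a few lines. This is a minimal illustration, assuming a standard PyTorch install; the tensor values are arbitrary and not taken from the paper:

```python
import torch

# Define-by-run: operations execute eagerly, like NumPy, while the
# autograd machinery records them as ordinary Python code runs.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
w = torch.tensor([0.5, 0.5, 0.5], requires_grad=True)

y = (w * x).sum()   # evaluated immediately; no separate graph-compile step
y.backward()        # gradients derived from the recorded operations

print(x.grad)       # dy/dx equals w
print(w.grad)       # dy/dw equals x
```

Because the model is just Python, standard tools (print statements, pdb, profilers) work directly on it, which is the debugging-friendliness the abstract claims.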

