Corpus ID: 2170930

DyNet: The Dynamic Neural Network Toolkit

@article{Neubig2017DyNetTD,
  title={DyNet: The Dynamic Neural Network Toolkit},
  author={Graham Neubig and Chris Dyer and Yoav Goldberg and Austin Matthews and Waleed Ammar and Antonios Anastasopoulos and Miguel Ballesteros and David Chiang and Daniel Clothiaux and Trevor Cohn and Kevin Duh and Manaal Faruqui and Cynthia Gan and Dan Garrette and Yangfeng Ji and Lingpeng Kong and Adhiguna Kuncoro and Manish Kumar and Chaitanya Malaviya and Paul Michel and Yusuke Oda and Matthew Richardson and Naomi Saphra and Swabha Swayamdipta and Pengcheng Yin},
  journal={ArXiv},
  year={2017},
  volume={abs/1701.03980}
}
  • Computer Science, Mathematics
  • We describe DyNet, a toolkit for implementing neural network models based on dynamic declaration of network structure. [...] Key result: experiments show that DyNet's speeds are faster than or comparable with static declaration toolkits, and significantly faster than Chainer, another dynamic declaration toolkit. DyNet is released open-source under the Apache 2.0 license and available at this http URL.
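The abstract's central idea, dynamic declaration (define-by-run), means the computation graph is rebuilt for every input, so its structure can depend on the data itself. Below is a conceptual pure-Python sketch of that pattern, assuming a toy recurrent step; it does not use DyNet's actual API, and all names are illustrative:

```python
# Conceptual sketch of dynamic declaration (define-by-run), the paradigm
# described in the DyNet abstract: the computation "graph" is constructed
# anew for each input, so its shape follows the data. This is a toy
# illustration, not DyNet's API.

def rnn_step(state, token_value, weight):
    # One recurrent step: fold the next token into the running state.
    return state * weight + token_value

def run_dynamic_graph(tokens, weight=0.5, init_state=0.0):
    # The "graph" is simply the chain of rnn_step calls built in this
    # loop; its depth equals len(tokens), so every input sequence gets
    # its own graph structure -- no padding or fixed unrolling needed.
    state = init_state
    for t in tokens:
        state = rnn_step(state, t, weight)
    return state

# Two inputs of different lengths produce graphs of different depths.
short_result = run_dynamic_graph([1.0, 2.0])            # depth-2 graph
long_result = run_dynamic_graph([1.0, 2.0, 3.0, 4.0])   # depth-4 graph
```

Under static declaration, by contrast, the graph is defined once up front and variable-length inputs must be handled with padding or control-flow operators inside the graph.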

    Citations

    Publications citing this paper.
    SHOWING 7 OF 293 CITATIONS

    • Cavs: An Efficient Runtime System for Dynamic Neural Networks (cites background & methods)

    • Cavs: A Vertex-centric Programming Interface for Dynamic Neural Networks (cites methods & background)

    • On-the-fly Operation Batching in Dynamic Computation Graphs (cites methods)

    • Improving the expressiveness of deep learning frameworks with recursion (cites background & methods; highly influenced)

    • In-Register Parameter Caching for Dynamic Neural Nets with Virtual Persistent Processor Specialization (cites methods & background; highly influenced)

    • On Machine Learning and Programming Languages (cites methods)

    • Jointly Learning Sentence Embeddings and Syntax with Unsupervised Tree-LSTMs (cites methods & background; highly influenced)


    CITATION STATISTICS

    • 25 Highly Influenced Citations

    • Averaged 74 Citations per year from 2018 through 2020

    References

    Publications referenced by this paper.
    SHOWING 3 OF 73 REFERENCES

    • Deep Learning with Dynamic Computation Graphs (highly influential)

    • Theano: A CPU and GPU Math Compiler in Python (highly influential)

    • MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems (highly influential)