Corpus ID: 199552070

Exploiting Parallelism Opportunities with Deep Learning Frameworks

@article{Wang2019ExploitingPO,
  title={Exploiting Parallelism Opportunities with Deep Learning Frameworks},
  author={Yu Emma Wang and Carole-Jean Wu and Xiaodong Wang and Kim Hazelwood and David Brooks},
  journal={ArXiv},
  year={2019},
  volume={abs/1908.04705}
}
Abstract: State-of-the-art machine learning frameworks support a wide variety of design features to enable a flexible machine learning programming interface and to ease the programmability burden on machine learning developers. Identifying and using a performance-optimal setting in feature-rich frameworks, however, involves a non-trivial amount of performance characterization and domain-specific knowledge. This paper takes a deep dive into analyzing the performance impact of key design features and the…
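The "performance-optimal settings" the abstract refers to are typically thread-count knobs. A minimal sketch of how such knobs are commonly exposed (the environment variables `OMP_NUM_THREADS` and `MKL_NUM_THREADS` are standard OpenMP/MKL settings consulted at import time by frameworks such as TensorFlow and PyTorch; the helper function is ours for illustration):

```python
import os

# Intra-op parallelism knobs read by the OpenMP/MKL backends that
# underlie most deep learning frameworks. They must be set before the
# framework is imported to take effect.
os.environ.setdefault("OMP_NUM_THREADS", "4")
os.environ.setdefault("MKL_NUM_THREADS", "4")

def thread_settings():
    """Report which parallelism knobs are currently in effect."""
    knobs = ("OMP_NUM_THREADS", "MKL_NUM_THREADS")
    return {k: os.environ.get(k, "<unset>") for k in knobs}

print(thread_settings())
```

Choosing good values for these knobs (relative to core count, batch size, and operator mix) is exactly the kind of characterization the paper studies.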

    Citations

    Publications citing this paper (showing 1-2 of 2):

    SMAUG: End-to-End Full-Stack Simulation Infrastructure for Deep Learning Workloads

