Stochastic rounding and reduced-precision fixed-point arithmetic for solving neural ordinary differential equations

@article{Hopkins2020StochasticRA,
  title={Stochastic rounding and reduced-precision fixed-point arithmetic for solving neural ordinary differential equations},
  author={Michael Hopkins and Mantas Mikaitis and Dave R. Lester and Stephen B. Furber},
  journal={Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences},
  year={2020},
  volume={378}
}
Abstract: Although double-precision floating-point arithmetic currently dominates high-performance computing, there is increasing interest in smaller and simpler arithmetic types. The main reasons are potential improvements in energy efficiency and in memory footprint and bandwidth. However, simply switching to lower-precision types typically results in increased numerical errors. We investigate approaches to improving the accuracy of reduced-precision fixed-point arithmetic types, using examples in an…
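The paper's central technique, stochastic rounding, rounds a value to the nearest fixed-point grid point probabilistically, with probability proportional to proximity, so that the rounding is unbiased in expectation. A minimal Python sketch of the idea (an illustration only, not the paper's SpiNNaker implementation; the function name and signature are assumptions):

```python
import math
import random

def stochastic_round_fixed(x: float, frac_bits: int, rng: random.Random) -> float:
    """Round x onto a fixed-point grid with `frac_bits` fractional bits.

    Rounds up with probability equal to the residual fraction, so the
    result is unbiased: E[result] == x (up to range limits).
    """
    scale = 1 << frac_bits          # grid spacing is 2**-frac_bits
    scaled = x * scale
    floor_val = math.floor(scaled)
    residual = scaled - floor_val   # fractional part, in [0, 1)
    if rng.random() < residual:     # round up with probability `residual`
        floor_val += 1
    return floor_val / scale
```

For example, with 4 fractional bits the value 0.1 lies between 1/16 = 0.0625 and 2/16 = 0.125; round-to-nearest always returns 0.125, while stochastic rounding returns 0.125 only 60% of the time, so the average over many roundings recovers 0.1. This unbiasedness is what lets reduced-precision ODE solvers avoid the systematic drift that deterministic rounding introduces.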
