# Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation

```bibtex
@article{Zhou2019NoisyMU,
  title   = {Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation},
  author  = {Chuteng Zhou and Prad Kadambi and Matthew Mattina and Paul N. Whatmough},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/2001.04974}
}
```

The success of deep learning has brought forth a wave of interest in computer hardware design to better meet the high demands of neural network inference. In particular, analog computing hardware, based on electronic, optical, or photonic devices, has been proposed specifically for accelerating neural networks and may well achieve lower power consumption than conventional digital electronics. However, these proposed analog accelerators suffer from the intrinsic noise generated …


#### References


Showing 1–10 of 34 references.

## Deep learning with coherent nanophotonic circuits

Highly influential

## Analog/Mixed-Signal Hardware Error Modeling for Deep Learning Inference

Highly influential

## On the Information Bottleneck Theory of Deep Learning

Highly influential

## SGDR: Stochastic Gradient Descent with Warm Restarts

Highly influential

## Distilling the Knowledge in a Neural Network

Highly influential

## Gradient-based learning applied to document recognition

Highly influential

## Adversarially Robust Distillation
