• Mathematics, Computer Science
  • Published in ICLR 2019

Neural network gradient-based learning of black-box function interfaces

@inproceedings{Jacovi2019NeuralNG,
  title={Neural network gradient-based learning of black-box function interfaces},
  author={Alon Jacovi and Guy Hadash and Einat Kermany and Boaz Carmeli and Ofer Lavi and George Kour and Jonathan Berant},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2019},
  note={arXiv:1901.03995}
}
Deep neural networks are good at approximating complicated functions when provided with data and trained by gradient descent. At the same time, there is a vast body of existing functions that programmatically solve different tasks in a precise manner, eliminating the need for training. In many cases, it is possible to decompose a task into a series of functions, of which for some we may prefer to use a neural network to learn the functionality, while for others the preferred method…
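The setup described above hinges on one mechanism: a non-differentiable black-box function blocks gradient flow, so a differentiable neural estimator is trained to mimic it, letting gradients propagate through its interface. Below is a minimal NumPy sketch of that mechanism; the `abs` black box, the network sizes, and the hyperparameters are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# A precise, non-differentiable "black box" (hypothetical stand-in for an
# existing programmatic function whose internals provide no gradients).
def black_box(x):
    return np.abs(x)

# Differentiable estimator: a tiny one-hidden-layer MLP trained to mimic
# the black box over its input range.
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

def estimator(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.05
for _ in range(3000):
    x = rng.uniform(-2.0, 2.0, (64, 1))
    y = black_box(x)                    # query the black box for targets
    pred, h = estimator(x)
    err = (pred - y) / len(x)           # MSE gradient (factor 2 folded into lr)
    # manual backpropagation through the MLP
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1.0 - h**2)
    gW1 = x.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# The trained estimator can now stand in for the black box wherever
# end-to-end gradients are needed by upstream modules.
xt = np.array([[1.5], [-0.8]])
pred, _ = estimator(xt)
print("max fit error:", float(np.abs(pred - black_box(xt)).max()))
```

Once the estimator is accurate, any upstream network that produces the black box's arguments can be trained by backpropagating through the estimator, even though the black box itself yields no gradients.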