Programming with neural surrogates of programs

  • Alex Renda, Y. Ding, Michael Carbin
  • Published 17 October 2021
  • Computer Science
  • Proceedings of the 2021 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2021)
Surrogates, models that mimic the behavior of programs, form the basis of a variety of development workflows. We study three surrogate-based design patterns, evaluating each in case studies on a large-scale CPU simulator. With surrogate compilation, programmers develop a surrogate that mimics the behavior of a program to deploy to end-users in place of the original program. Surrogate compilation accelerates the CPU simulator under study by 1.6×. With surrogate adaptation, programmers develop a… 
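The surrogate compilation workflow the abstract describes — run the original program to collect input/output examples, fit a model to them, then deploy the model in the program's place — can be sketched in miniature. This is an illustrative sketch only: `original_program` is a hypothetical stand-in for the CPU simulator, and a closed-form linear fit stands in for training a neural surrogate.

```python
import random

# Hypothetical stand-in for the expensive program being surrogated
# (e.g., a CPU simulator); not from the paper.
def original_program(x):
    return 3.0 * x + 7.0

# 1. Run the original program to collect input/output training pairs.
inputs = [random.uniform(0.0, 10.0) for _ in range(100)]
data = [(x, original_program(x)) for x in inputs]

# 2. Fit a surrogate to the pairs. Closed-form linear least squares
#    stands in here for training a neural network on the same data.
n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)
w = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - w * sx) / n

# 3. Deploy the surrogate in place of the original program:
#    callers invoke surrogate() instead of original_program().
def surrogate(x):
    return w * x + b
```

The 1.6× speedup the paper reports comes from exactly this substitution: the surrogate answers queries without paying the original program's execution cost.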

NEUZZ: Efficient Fuzzing with Neural Program Smoothing
A novel program smoothing technique is proposed, using surrogate neural network models that incrementally learn smooth approximations of a complex, real-world program's branching behavior; combined with gradient-guided input generation schemes, it significantly increases the efficiency of the fuzzing process.
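The core idea of the summary above — replace a hard branch, whose coverage signal is flat almost everywhere, with a smooth surrogate whose gradient can guide input mutation — can be sketched as follows. Everything here is hypothetical: `branchy_program` is a toy target, and a sigmoid stands in for the trained surrogate network.

```python
import math

# Hypothetical hard-branching target program: its output is flat almost
# everywhere, so it offers no gradient signal for a fuzzer directly.
def branchy_program(x):
    return 1.0 if x > 42.0 else 0.0

# Hypothetical smooth surrogate of the branch; a sigmoid stands in for
# a neural network trained to approximate the program's behavior.
def smooth_surrogate(x):
    return 1.0 / (1.0 + math.exp(-(x - 42.0)))

def gradient_guided_mutation(x, lr=10.0, eps=1e-4):
    # A finite-difference gradient of the surrogate points toward inputs
    # likely to flip the branch; step the fuzzer's input along it.
    g = (smooth_surrogate(x + eps) - smooth_surrogate(x - eps)) / (2 * eps)
    return x + lr * g

seed = 40.0
for _ in range(100):
    seed = gradient_guided_mutation(seed)
# The mutated seed has crossed the branch threshold the original
# program's flat output could never have guided it toward.
```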
PyTorch: An Imperative Style, High-Performance Deep Learning Library
This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.
Towards Automated Construction of Compiler Optimizations
  • Ph.D. Thesis, Massachusetts Institute of Technology, 2020
Transfer Learning as a Tool for Reducing Simulation Bias: Application to Inertial Confinement Fusion
Numerical tests investigate whether transfer learning can predict the observable outcomes of inertial confinement fusion (ICF) experiments under new conditions, demonstrating that an accurate predictive model can be built by retraining an initial surrogate model with experimental data volumes small enough to be realistic for the ICF problem.
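The retraining idea in the summary above — warm-start from a surrogate trained on plentiful but biased simulation data, then fine-tune on a handful of experimental points — can be sketched like this. The data is synthetic and a linear model trained by SGD stands in for the neural surrogate.

```python
# Plain SGD on a linear model y = w*x + b; stands in for
# (re)training a neural surrogate.
def fit_sgd(w, b, data, epochs, lr):
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Pretrain on plentiful simulation data following y = 2x
# (the biased simulator).
simulation = [(i / 100.0, 2.0 * i / 100.0) for i in range(100)]
w, b = fit_sgd(0.0, 0.0, simulation, epochs=200, lr=0.05)

# Fine-tune the warm-started weights on just three "experimental"
# points following the true relation y = 2x + 1.
experiments = [(0.0, 1.0), (0.5, 2.0), (1.0, 3.0)]
w, b = fit_sgd(w, b, experiments, epochs=2000, lr=0.05)
```

Because fine-tuning starts from the pretrained weights rather than from scratch, a very small experimental dataset suffices to correct the simulator's bias.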
Transfer Learning for Design-Space Exploration with High-Level Synthesis
  • Jihye Kwon, L. Carloni
  • Computer Science
  • 2020 ACM/IEEE 2nd Workshop on Machine Learning for CAD (MLCAD), 2020
A novel neural network model reuses the knowledge obtained from previously explored design spaces when exploring a new target design space, outperforming both single-domain and hard-sharing models in predicting performance and cost at early stages of HLS-driven DSE.
Deep Probabilistic Surrogate Networks for Universal Simulator Approximation
Employing the surrogate modeling technique makes inference an order of magnitude faster, opening up the possibility of doing simulator-based, non-invasive, just-in-time parts quality testing; in this case inferring safety-critical latent internal temperature profiles of composite materials undergoing curing from surface temperature profile measurements.
Imitation-Projected Programmatic Reinforcement Learning
The experiments show that PROPEL can significantly outperform state-of-the-art approaches for learning programmatic policies and exploit contemporary combinatorial methods for this task.
  • Advances in Neural Information Processing Systems, 2019
Attention is All you Need
A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, applying successfully to English constituency parsing with both large and limited training data.
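The attention mechanism the summary above refers to is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal sketch over plain lists, purely for illustration (real implementations use batched tensor operations):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.
    # Q, K, V are lists of vectors (lists of floats).
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = softmax([
            sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
            for k in K
        ])
        # Output is the attention-weighted average of the values.
        out.append([
            sum(a * v[j] for a, v in zip(scores, V))
            for j in range(len(V[0]))
        ])
    return out
```

When a query aligns strongly with one key, the softmax concentrates nearly all weight on that key's value, which is how attention routes information without recurrence or convolution.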