We propose a novel probabilistic model for graph generation that builds gated graph neural networks into the encoder and decoder of a variational autoencoder.

Generative models for source code are an interesting structured prediction problem, requiring reasoning about both hard syntactic and semantic constraints and about natural, likely programs.

We introduce a novel deterministic method to approximate moments in neural networks, eliminating gradient variance; second, we introduce a hierarchical prior for parameters and a novel Empirical Bayes procedure for automatically selecting prior variances.
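As a minimal illustration of what deterministic moment approximation can mean, the NumPy sketch below propagates a Gaussian mean and variance through a single linear layer in closed form. This is only an assumed, simplified example of the general idea (the function name `linear_moments` and all values are hypothetical), not the paper's actual method.

```python
import numpy as np

def linear_moments(W, b, mu_x, var_x):
    """Illustrative sketch (not the paper's implementation):
    closed-form output moments for y = W @ x + b when the inputs
    are independent Gaussians with mean mu_x and variance var_x."""
    mu_y = W @ mu_x + b
    # Variances pass through a linear map via the squared weights.
    var_y = (W ** 2) @ var_x
    return mu_y, var_y

W = np.array([[1.0, 2.0], [0.5, -1.0]])
b = np.array([0.1, 0.2])
mu, var = linear_moments(W, b, np.array([0.0, 1.0]), np.array([0.5, 0.25]))
```

Because the output moments are computed analytically rather than by Monte Carlo sampling, gradients of any loss built on `mu` and `var` are deterministic, which is the sense in which such methods eliminate gradient variance.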

We have observed the Bose-Einstein condensation of an atomic gas in the (quasi)uniform three-dimensional potential of an optical box trap. Condensation is seen in the bimodal momentum distribution…

We study machine learning formulations of inductive program synthesis; given input-output examples, we try to synthesize source code that maps inputs to corresponding outputs.

We develop a framework for combining differentiable programming languages with neural networks to create end-to-end trainable systems that learn to write interpretable algorithms with perceptual components.

We have formulated and experimentally demonstrated an improved algorithm for the design of arbitrary two-dimensional holographic traps for ultracold atoms.

Breaking the symmetry in an atomic gas: Cooling a physical system through a phase transition typically makes it less symmetrical. If the cooling is done very slowly, this symmetry change is uniform…