Gated Graph Sequence Neural Networks
TLDR
This work studies feature learning techniques for graph-structured inputs and achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
Learning to Represent Programs with Graphs
TLDR
This work proposes to use graphs to represent both the syntactic and semantic structure of code and use graph-based deep learning methods to learn to reason over program structures, and suggests that these models learn to infer meaningful names and to solve the VarMisuse task in many cases.
DeepCoder: Learning to Write Programs
TLDR
The approach is to train a neural network to predict properties of the program that generated the outputs from the inputs to augment search techniques from the programming languages community, including enumerative search and an SMT-based solver.
CodeSearchNet Challenge: Evaluating the State of Semantic Code Search
TLDR
The methodology used to obtain the corpus and expert labels is described, along with a number of simple baseline solutions for the task.
Constrained Graph Variational Autoencoders for Molecule Design
TLDR
A variational autoencoder model in which both encoder and decoder are graph-structured is proposed and it is shown that by using appropriate shaping of the latent space, this model allows us to design molecules that are (locally) optimal in desired properties.
Structured Neural Summarization
TLDR
This work develops a framework to extend existing sequence encoders with a graph component that can reason about long-distance relationships in weakly structured data such as text and shows that the resulting hybrid sequence-graph models outperform both pure sequence models as well as pure graph models on a range of summarization tasks.
Alternating Runtime and Size Complexity Analysis of Integer Programs
TLDR
This work presents a novel alternation between finding symbolic time bounds for program parts and using these to infer size bounds on program variables; this alternation restricts each analysis step to a small part of the program while maintaining a high level of precision.
Analyzing Program Termination and Complexity Automatically with AProVE
TLDR
The tool AProVE is presented for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems, and a corresponding plug-in for the popular Eclipse software development environment is presented.
GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation
TLDR
This paper presents a new Graph Neural Network type using feature-wise linear modulation (FiLM), which outperforms baseline methods on a regression task on molecular graphs and performs competitively on other tasks.
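The core idea of GNN-FiLM can be illustrated with a minimal sketch: during message passing, the *target* node's features produce a per-feature scale (gamma) and shift (beta) that modulate each incoming message. The helper names, weight matrices, and tiny graph below are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal, dependency-free sketch of one GNN-FiLM message-passing step
# (single edge type). All names and weights here are illustrative.

def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def film_layer(h, edges, W, W_gamma, W_beta):
    """h: list of node feature vectors; edges: (src, tgt) pairs.
    Each message W @ h[src] is scaled by gamma and shifted by beta,
    both computed from the target node's features, then summed
    per target and passed through a ReLU."""
    d = len(h[0])
    out = [[0.0] * d for _ in h]
    for src, tgt in edges:
        gamma = matvec(W_gamma, h[tgt])  # target node conditions the message
        beta = matvec(W_beta, h[tgt])
        msg = matvec(W, h[src])
        for i in range(d):
            out[tgt][i] += gamma[i] * msg[i] + beta[i]
    return [[max(0.0, v) for v in row] for row in out]

# Tiny example with identity weights: node 1 receives node 0's message,
# modulated feature-wise by node 1's own features.
h = [[1.0, 2.0], [3.0, 4.0]]
I2 = [[1.0, 0.0], [0.0, 1.0]]
print(film_layer(h, [(0, 1)], I2, I2, I2))  # [[0.0, 0.0], [6.0, 12.0]]
```

With identity weights, node 1's update is gamma * msg + beta = [3*1+3, 4*2+4] = [6, 12], showing how the same message would be transformed differently at a different target node.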
Learning to Represent Edits
We introduce the problem of learning distributed representations of edits. By combining a "neural editor" with an "edit encoder", our models learn to represent the salient information of an edit and
...