Learning to Execute

Abstract

Recurrent Neural Networks (RNNs) with Long Short-Term Memory units (LSTM) are widely used because they are expressive and are easy to train. Our interest lies in empirically evaluating the expressiveness and the learnability of LSTMs in the sequence-to-sequence regime by training them to evaluate short computer programs, a domain that has traditionally been…
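
The task described in the abstract—reading the characters of a short program and emitting the characters of its printed output—can be framed as character-level sequence-to-sequence learning. The sketch below is a minimal illustration of that framing, assuming PyTorch; the vocabulary, hyperparameters, toy program, and training loop are illustrative assumptions, not the paper's actual program generator or training configuration.

```python
import torch
import torch.nn as nn

# Toy character vocabulary covering short programs and their printed integer
# outputs (illustrative only; the paper uses its own program generator).
VOCAB = sorted(set("abcdefghij0123456789+-*%=():<>,. \nforwhileprintels"))
stoi = {c: i for i, c in enumerate(VOCAB)}

def encode(s):
    """Map a string to a 1 x len(s) tensor of character ids."""
    return torch.tensor([[stoi[c] for c in s]], dtype=torch.long)

class Seq2SeqLSTM(nn.Module):
    """Encoder LSTM reads the program; decoder LSTM emits the output string."""
    def __init__(self, vocab_size, emb=64, hidden=256, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.LSTM(emb, hidden, layers, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, layers, batch_first=True)
        self.proj = nn.Linear(hidden, vocab_size)

    def forward(self, program_ids, target_ids):
        # Encode the program; keep only the final (hidden, cell) state.
        _, state = self.encoder(self.embed(program_ids))
        # Decode conditioned on that state (teacher forcing on the target;
        # a real setup would prepend a start symbol and shift by one step).
        dec_out, _ = self.decoder(self.embed(target_ids), state)
        return self.proj(dec_out)  # per-step logits over characters

# One illustrative gradient step on a single hand-written example.
program = "a=8\nb=3\nprint(a+b)\n"
target = "11\n"
model = Seq2SeqLSTM(len(VOCAB))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
logits = model(encode(program), encode(target))
loss = nn.functional.cross_entropy(logits.view(-1, len(VOCAB)),
                                   encode(target).view(-1))
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.3f}")
```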

Statistics

235 Citations

Semantic Scholar estimates that this publication has 235 citations based on the available data. [Citations-per-year chart: 2014–2018]