• Publications
In Search of Robust Measures of Generalization
This work addresses the question of how to evaluate generalization bounds empirically and argues that, rather than being judged in a single fixed setting, generalization measures should be evaluated within the framework of distributional robustness.
Pretraining Representations for Data-Efficient Reinforcement Learning
This work uses unlabeled data to pretrain an encoder which is then finetuned on a small amount of task-specific data, employing a combination of latent dynamics modelling and unsupervised goal-conditioned RL to encourage representations that capture diverse aspects of the underlying MDP.
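The latent dynamics modelling component mentioned above can be illustrated with a minimal sketch. The encoder, forward model, dimensions, and loss below are all hypothetical stand-ins (simple linear maps rather than the neural networks used in practice), not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, not taken from the paper.
obs_dim, latent_dim, action_dim = 8, 4, 2

# Linear maps stand in for the encoder and latent forward model.
W_enc = rng.normal(size=(latent_dim, obs_dim)) * 0.1
W_dyn = rng.normal(size=(latent_dim, latent_dim + action_dim)) * 0.1

def encode(obs):
    return W_enc @ obs

def predict_next_latent(z, action):
    return W_dyn @ np.concatenate([z, action])

def latent_dynamics_loss(obs, action, next_obs):
    # Encode both observations, predict the next latent from the
    # current latent and the action, and penalise prediction error.
    z, z_next = encode(obs), encode(next_obs)
    z_pred = predict_next_latent(z, action)
    return float(np.mean((z_pred - z_next) ** 2))

obs, next_obs = rng.normal(size=obs_dim), rng.normal(size=obs_dim)
action = rng.normal(size=action_dim)
print(latent_dynamics_loss(obs, action, next_obs))
```

Minimising a loss of this shape over transition tuples pushes the encoder toward representations from which the environment's dynamics are predictable.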
Evaluating the Text-to-SQL Capabilities of Large Language Models
It is demonstrated on the GeoQuery and Scholar benchmarks that a small number of in-domain examples provided in the prompt enables Codex to outperform state-of-the-art models finetuned on the same few examples.
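The in-context setup described above amounts to prepending (question, SQL) pairs to the prompt before the new question. The sketch below shows one way to assemble such a prompt; the example pairs and `--` comment format are illustrative assumptions, not the prompts actually used in the paper:

```python
# Hypothetical in-domain (question, SQL) pairs in the style of GeoQuery.
examples = [
    ("What is the capital of Texas?",
     "SELECT capital FROM state WHERE state_name = 'texas';"),
    ("How many rivers are in Colorado?",
     "SELECT COUNT(river_name) FROM river WHERE traverse = 'colorado';"),
]

def build_prompt(question, examples):
    """Concatenate few-shot pairs, then the new question with a SQL stub
    for the model to complete."""
    parts = [f"-- Question: {q}\n{sql}" for q, sql in examples]
    parts.append(f"-- Question: {question}\nSELECT")
    return "\n\n".join(parts)

prompt = build_prompt("What states border Texas?", examples)
print(prompt)
```

The resulting string would then be sent to the model for completion; the trailing `SELECT` nudges the continuation toward a SQL query.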
Myriad: a real-world testbed to bridge trajectory optimization and deep learning
Myriad leverages an implicit planning module over neural ordinary differential equations, enabling simultaneous learning and planning with complex environment dynamics; the Myriad repository is used to showcase this approach on learning and control tasks.