Publications
Text Generation from Knowledge Graphs with Graph Transformers
This work addresses the problem of generating coherent multi-sentence texts from the output of an information extraction system, and in particular from a knowledge graph, by introducing a novel graph-transforming encoder that can leverage the relational structure of such knowledge graphs without imposing linearization or hierarchical constraints.
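To make the encoder idea concrete, below is a minimal PyTorch sketch of attention restricted to graph neighbors through an adjacency mask, so entity nodes attend only along relations rather than over a linearized sequence. The class name, layer sizes, and wiring are illustrative assumptions, not the paper's published architecture.

```python
import torch
import torch.nn as nn

class GraphAttentionEncoderLayer(nn.Module):
    """Illustrative sketch (not the paper's exact encoder): self-attention
    masked by the graph's adjacency, so nodes attend only to neighbors."""

    def __init__(self, dim: int = 64, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, nodes: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        mask = adjacency == 0  # True blocks attention between unconnected nodes
        out, _ = self.attn(nodes, nodes, nodes, attn_mask=mask)
        return nodes + self.ff(out)  # residual connection around the feed-forward

nodes = torch.randn(1, 3, 64)            # 3 entity nodes with 64-dim features
adj = torch.eye(3, dtype=torch.bool)     # self-loops keep every node attendable
adj[0, 1] = adj[1, 0] = True             # one relation between nodes 0 and 1
layer = GraphAttentionEncoderLayer()
print(layer(nodes, adj).shape)           # torch.Size([1, 3, 64])
```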
Parsing Algebraic Word Problems into Equations
This paper formalizes the problem of solving multi-sentence algebraic word problems as that of generating and scoring equation trees. We use integer linear programming to generate equation trees and …
MathQA: Towards Interpretable Math Word Problem Solving with Operation-Based Formalisms
This work introduces a large-scale dataset of math word problems, an interpretable neural solver that learns to map problems to their operation programs, and a new representation language for modeling the operation program corresponding to each problem, with the aim of improving both the performance and the interpretability of the learned models.
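As a sketch of what an operation program can look like, the toy interpreter below executes a sequence of arithmetic operations over the numbers extracted from a problem, with `#k` referring to the result of step k. The format and helper names here are assumptions for illustration; MathQA's actual representation language differs in its details.

```python
# Hypothetical operation-program format in the spirit of MathQA's formalism:
# each step applies an operation to problem numbers (n0, n1, ...) or to
# earlier intermediate results (#0, #1, ...).
OPS = {"add": lambda a, b: a + b, "subtract": lambda a, b: a - b,
       "multiply": lambda a, b: a * b, "divide": lambda a, b: a / b}

def run_program(program: list[tuple[str, str, str]], numbers: list[float]) -> float:
    results: list[float] = []
    def value(ref: str) -> float:
        # "#k" -> k-th intermediate result; "nk" -> k-th number from the problem
        return results[int(ref[1:])] if ref.startswith("#") else numbers[int(ref[1:])]
    for op, x, y in program:
        results.append(OPS[op](value(x), value(y)))
    return results[-1]

# "A car travels 120 miles on 4 gallons; gas costs 3 dollars per gallon.
#  What is the fuel cost per mile?" -> multiply(n1, n2); divide(#0, n0)
print(run_program([("multiply", "n1", "n2"), ("divide", "#0", "n0")], [120, 4, 3]))  # 0.1
```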
MAWPS: A Math Word Problem Repository
MAWPS allows for the automatic construction of datasets with particular characteristics, providing tools for tuning the lexical and template overlap of a dataset as well as for filtering ungrammatical problems from web-sourced corpora.
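As an illustration of the kind of overlap tuning described here, the sketch below measures pairwise lexical overlap between word problems as Jaccard similarity over token sets and greedily filters a corpus to a maximum overlap threshold. The function names and threshold are hypothetical, not MAWPS's actual tooling.

```python
# Illustrative sketch (hypothetical, not the MAWPS API): filter a corpus so
# that no two kept problems exceed a lexical-overlap threshold.

def lexical_overlap(a: str, b: str) -> float:
    """Jaccard similarity over lowercase token sets."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    if not ta and not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

def filter_by_overlap(problems: list[str], max_overlap: float = 0.5) -> list[str]:
    """Greedily keep problems whose overlap with every kept problem stays below the threshold."""
    kept: list[str] = []
    for p in problems:
        if all(lexical_overlap(p, q) < max_overlap for q in kept):
            kept.append(p)
    return kept

corpus = [
    "John has 3 apples and buys 2 more. How many apples does he have?",
    "John has 5 apples and buys 4 more. How many apples does he have?",
    "A train travels 60 miles in 2 hours. What is its speed?",
]
print(filter_by_overlap(corpus))  # drops the near-duplicate second problem
```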
A Theme-Rewriting Approach for Generating Algebra Word Problems
This paper presents a text generation method called theme rewriting, which edits existing human-authored narratives to change their theme without changing the underlying story. Applied to math word problems, it might help students stay more engaged by quickly transforming their homework assignments to the theme of their favorite movie without changing the math concepts being taught.
A Controllable Model of Grounded Response Generation
Quantitative and qualitative results show that, using this framework, a GPT-2-based model trained on a conversation-like Reddit dataset outperforms strong generation baselines.
DeFINE: DEep Factorized INput Word Embeddings for Neural Sequence Modeling
DeFINE is a new method for learning deep word-level representations efficiently. It uses a hierarchical structure with novel skip-connections that allows for low-dimensional input and output layers, reducing total parameters and training time while delivering similar or better performance than existing methods.
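To make the factorization idea concrete, here is a minimal PyTorch sketch of a deep, factorized input embedding: a low-dimensional lookup table expanded to the model dimension through a stack of linear layers, with a skip connection back to the input. The layer sizes and exact wiring are assumptions for illustration, not the DeFINE architecture as published.

```python
import torch
import torch.nn as nn

class FactorizedEmbedding(nn.Module):
    """Illustrative sketch (dimensions and wiring are assumptions, not the
    published DeFINE design): a small lookup table expanded to the model
    dimension through linear layers, with a skip connection from the input."""

    def __init__(self, vocab_size: int, low_dim: int = 64,
                 model_dim: int = 512, depth: int = 3):
        super().__init__()
        self.lookup = nn.Embedding(vocab_size, low_dim)  # low-dimensional input layer
        self.layers = nn.ModuleList(
            [nn.Linear(low_dim if i == 0 else model_dim, model_dim) for i in range(depth)]
        )
        self.skip = nn.Linear(low_dim, model_dim)  # direct path from the input

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        x = self.lookup(token_ids)
        h = x
        for layer in self.layers:
            h = torch.relu(layer(h))
        return h + self.skip(x)  # skip connection around the deep expansion

emb = FactorizedEmbedding(vocab_size=10000)
print(emb(torch.tensor([[1, 2, 3]])).shape)  # torch.Size([1, 3, 512])
```

The parameter saving comes from the lookup table: a 10,000-word vocabulary at 64 dimensions costs 640K embedding parameters instead of 5.12M at 512 dimensions, with the shared expansion layers amortized across the whole vocabulary.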
Data-Driven Methods for Solving Algebra Word Problems
It is shown that well-tuned neural equation classifiers can outperform more sophisticated models, such as sequence-to-sequence and self-attention models, across these datasets.
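A hedged sketch of what an equation classifier can mean in practice: score a fixed inventory of equation templates from a bag-of-words encoding of the problem text, rather than generating the equation token by token. The template inventory and encoder below are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn

# Hypothetical template inventory; real systems derive these from training data.
TEMPLATES = ["x = a + b", "x = a - b", "x = a * b", "x = a / b"]

class EquationClassifier(nn.Module):
    """Illustrative sketch: map a word problem's bag-of-words vector to one of
    a fixed set of equation templates (an assumption, not the paper's model)."""

    def __init__(self, vocab_size: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(vocab_size, hidden),
            nn.ReLU(),
            nn.Linear(hidden, len(TEMPLATES)),
        )

    def forward(self, bow: torch.Tensor) -> torch.Tensor:
        return self.net(bow)  # logits over equation templates

vocab_size = 1000
model = EquationClassifier(vocab_size)
bow = torch.zeros(1, vocab_size)
bow[0, [3, 17, 42]] = 1.0  # toy bag-of-words for one problem
print(TEMPLATES[model(bow).argmax(dim=-1).item()])  # arbitrary until trained
```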
Multi-Resolution Language Grounding with Weak Supervision
This work introduces an approach to multi-resolution language grounding in the extremely challenging domain of professional soccer commentaries and defines a factored objective function that leverages discourse structure and the compositional nature of both language and game events.
Citation Text Generation
This paper establishes the task of citation text generation with a standard evaluation corpus, develops several strong baseline models for the task, and provides extensive automatic and human evaluations to illustrate the successes and shortcomings of current text generation techniques.