Semantic Parsing on Freebase from Question-Answer Pairs
This paper trains a semantic parser that scales up to Freebase and outperforms the state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having access to annotated logical forms.
CommonsenseQA: A Question Answering Challenge Targeting Commonsense Knowledge
This work presents CommonsenseQA, a challenging new dataset for commonsense question answering, built by extracting from ConceptNet multiple target concepts that share the same semantic relation to a single source concept.
Building a Semantic Parser Overnight
A new methodology is introduced that uses a simple grammar to generate logical forms paired with canonical utterances covering the desired set of compositional operators, and uses crowdsourcing to paraphrase these canonical utterances into natural utterances.
The Web as a Knowledge-Base for Answering Complex Questions
This paper proposes to decompose complex questions into a sequence of simple questions and compute the final answer from the sequence of answers, and empirically demonstrates that question decomposition improves performance from 20.8 to 27.5 precision@1 on this new dataset.
Semantic Parsing via Paraphrasing
This paper presents two simple paraphrase models, an association model and a vector space model, and trains them jointly from question-answer pairs, improving state-of-the-art accuracy on two recently released question-answering datasets.
Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision
- Chen Liang, Jonathan Berant, Quoc V. Le, Kenneth D. Forbus, N. Lao
- Computer Science · ACL
- 31 October 2016
A Neural Symbolic Machine is introduced, containing a neural “programmer” that maps language utterances to programs and uses a key-variable memory to handle compositionality, and a symbolic “computer”, i.e., a Lisp interpreter that executes programs and helps find good ones by pruning the search space.
Representing Schema Structure with Graph Neural Networks for Text-to-SQL Parsing
This paper presents an encoder-decoder semantic parser in which the structure of the DB schema is encoded with a graph neural network, and this representation is used at both encoding and decoding time.
Global Learning of Typed Entailment Rules
The results show that using global transitivity information substantially improves performance over this resource and several baselines, and that the scaling methods make it possible to broaden the scope of global learning of entailment-rule graphs.
oLMpics-On What Language Model Pre-training Captures
- Alon Talmor, Yanai Elazar, Yoav Goldberg, Jonathan Berant
- Computer Science · Transactions of the Association for Computational Linguistics
- 31 December 2019
This work proposes eight reasoning tasks that conceptually require operations such as comparison, conjunction, and composition; its findings can help future work on designing new datasets, models, and objective functions for pre-training.
Text Segmentation as a Supervised Learning Task
- Omri Koshorek, Adir Cohen, Noam Mor, Michael Rotman, Jonathan Berant
- Computer Science · NAACL
- 25 March 2018
This work formulates text segmentation as a supervised learning problem, presents a large new dataset for text segmentation that is automatically extracted and labeled from Wikipedia, and develops a segmentation model that generalizes well to unseen natural text.