Never-Ending Learning
TLDR
This paper describes the Never-Ending Language Learner (NELL), which achieves some of the desired properties of a never-ending learner, and discusses lessons learned.
Learning a Neural Semantic Parser from User Feedback
We present an approach to rapidly and easily build natural language interfaces to databases for new domains, whose performance improves over time based on user feedback, and requires minimal…
Neural Semantic Parsing with Type Constraints for Semi-Structured Tables
TLDR
A new semantic parsing model is presented for answering compositional questions on semi-structured Wikipedia tables with state-of-the-art accuracy, and type constraints and entity linking are shown to be valuable components to incorporate in neural semantic parsers.
Incorporating Vector Space Similarity in Random Walk Inference over Knowledge Bases
TLDR
A new technique is presented for combining KB relations and surface text into a single graph representation that is much more compact than the graphs used in prior work, and a method for incorporating vector space similarity into random walk inference over KBs is described.
Jointly Learning to Parse and Perceive: Connecting Natural Language to the Physical World
TLDR
This paper introduces Logical Semantics with Perception (LSP), a model for grounded language acquisition that learns to map natural language statements to their referents in a physical environment, and finds that LSP outperforms existing, less expressive models that cannot represent relational language.
Weakly Supervised Training of Semantic Parsers
TLDR
This work presents a method for training a semantic parser using only a knowledge base and an unlabeled text corpus, without any individually annotated sentences, and demonstrates the approach by extracting logical forms from natural language queries against Freebase.
Instructable Intelligent Personal Agent
TLDR
Users are shown to voluntarily teach LIA new commands, and these taught commands significantly reduce task completion time, demonstrating the potential of natural language instruction as a significant, under-explored paradigm for machine learning.
Semantic Parsing to Probabilistic Programs for Situated Question Answering
TLDR
P3 is presented, a novel situated question answering model that treats semantic parses as probabilistic programs which execute nondeterministically and whose possible executions represent environmental uncertainty. This allows P3 to use background knowledge and global features of the question/environment interpretation while retaining efficient approximate inference.
Vector Space Semantic Parsing: A Framework for Compositional Vector Space Models
We present vector space semantic parsing (VSSP), a framework for learning compositional models of vector space semantics. Our framework uses Combinatory Categorial Grammar (CCG) to define a…
Structured Set Matching Networks for One-Shot Part Labeling
TLDR
The Structured Set Matching Network (SSMN), a structured prediction model that incorporates convolutional neural networks, is introduced for the problem of one-shot part labeling: labeling multiple parts of an object in a target image given only a single source image of that category.